by community-syndication | Aug 21, 2006 | BizTalk Community Blogs via Syndication
This of course is something I believe pretty intensely, having been a trainer for the
last 10 years or so. Scott and Brian (and Tomas has
sort of piled on) have posted a couple of entries on how they think Windows Workflow
Foundation (WF) might be too complex. In general I think they are pretty much
totally wrong (isn’t disagreement and discourse the cool thing about the web? 😉
). First I will address their particular points directly – and then give my
overall assessment of WF. And, in full disclosure, although sometimes people
mistake me for an MS employee – I am *not*.
First Brian’s points (not to pick on him – but since he was first to post I’ll
respond to his main points first):
1) On properties. Brian has an interesting point here: events are
displayed in the properties grid in a way that is not segregated from “properties”
as it is for other .NET objects. But the thing he misses is that
*most* events are (and should be) DependencyProperties. Because they are DependencyProperties
they can be bound using ActivityBind, and from that point of view they really do belong in the
same part of the property grid (since you should be binding an event of one Activity
to another Activity – you aren’t using the *normal* += syntax when binding those two
objects together). This point is pretty minor IMO, and so is my response.
When writing WF properties and events there really isn’t much difference in the syntax.
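To make that concrete, here is a minimal sketch of how a custom Activity typically registers an event as a DependencyProperty – the activity and event names are illustrative, not taken from either post:

```csharp
using System;
using System.Workflow.ComponentModel;

public class ApprovalActivity : Activity
{
    // Registering the event as a DependencyProperty is what lets the
    // designer bind it to another Activity via ActivityBind.
    public static readonly DependencyProperty ApprovedEvent =
        DependencyProperty.Register("Approved", typeof(EventHandler), typeof(ApprovalActivity));

    public event EventHandler Approved
    {
        add { base.AddHandler(ApprovedEvent, value); }
        remove { base.RemoveHandler(ApprovedEvent, value); }
    }
}
```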
2) Code Conditions. So why do CodeConditions have an EventArgs? VB.NET. VB.NET
cannot deal with delegates that have a return value (unless this has been fixed and
I didn’t know about it). In general he misses the point here as well.
In *most* real WF applications you don’t want to be using CodeCondition – you want
to be using DeclarativeRuleCondition, since you’ll want the flexibility of the WF rule
execution rather than hardcoding conditions in code. Also – since the *preferred*
model of WF is to have as little code as possible in your root workflow (which
enables more dynamic scenarios as well as XAML creation of workflows) – using CodeCondition
is really just for demos and such, IMO.
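For comparison, this is the shape of a CodeCondition handler – note the result travels back through the EventArgs rather than a return value, which is what keeps the delegate signature VB.NET-friendly (the method and field names here are hypothetical):

```csharp
using System.Workflow.Activities;

// Wired to, e.g., an IfElseBranchActivity's Condition in the designer.
private void IsSmallOrder(object sender, ConditionalEventArgs e)
{
    // The outcome is reported via e.Result instead of a return value.
    e.Result = this.orderTotal < 1000;
}
```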
3) HandleExternalEvent/CallExternalMethod. Granted, communication between the
Host and running workflows isn’t perhaps the best part of WF. But the barrier
is there for a good reason – because the model of WF supports persisting workflow
instances. If a reference to an object could be passed directly into a workflow
instance, that could cause issues when using persistence. Now – is using HEE/CEM
complex? Perhaps at first, but once you get used to it, it really isn’t
all that complex – *and* generally on a particular WF project you’ll have your interface
and types defined pretty early in the process, and then, like magic, you
are done and can get on to writing other Activities.
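For reference, a typical HEE/CEM contract looks something like this minimal sketch (the service and type names are invented for illustration):

```csharp
using System;
using System.Workflow.Activities;

[ExternalDataExchange]
public interface IOrderService
{
    // Workflow -> host, invoked through a CallExternalMethod activity.
    void NotifyShipped(int orderId);

    // Host -> workflow, delivered through a HandleExternalEvent activity.
    event EventHandler<OrderApprovedEventArgs> OrderApproved;
}

// Event args must be serializable and carry the instance id so the runtime
// can route the event to the right (possibly persisted) workflow instance.
[Serializable]
public class OrderApprovedEventArgs : ExternalDataEventArgs
{
    public int OrderId;

    public OrderApprovedEventArgs(Guid instanceId, int orderId)
        : base(instanceId)
    {
        OrderId = orderId;
    }
}
```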
Also – how much more complex is that than defining a WCF (Indigo) ServiceContract
interface and corresponding message types, and the configuration entries for the bindings,
etc. etc. etc.? I think it is just about as complex – which really tells me
it is about as complex as it needs to be to be generic.
Now – you also have to remember that HEE/CEM is just *one* way to communicate between
the host and the workflow instances. The real communication mechanism (which
HEE and the ExternalDataExchangeService use) is the WF workflow queuing mechanism.
So if HEE/CEM is too complex or not complex enough (which is actually what I’ve run
into a number of times), you can create Activities that listen on application-specific
queues, and create services that Activities can use to communicate with
the Host. The big thing to remember is that this indirect communication is essential
for the WF model to succeed.
Also – he fails to mention the ability (in a very simple workflow) to pass parameters
into a Workflow Instance and get parameters back out. That is probably the mechanism
you’d use in a “WF-lite” kind of WF usage.
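As a rough sketch of that parameter-passing model (OrderWorkflow, OrderId and Status are made-up names, and runtime is assumed to be an already-configured WorkflowRuntime):

```csharp
using System;
using System.Collections.Generic;
using System.Workflow.Runtime;

// Outputs come back through the WorkflowCompleted event once the instance finishes.
runtime.WorkflowCompleted += delegate(object sender, WorkflowCompletedEventArgs e)
{
    object status = e.OutputParameters["Status"];
};

// Inputs are matched by name to public properties on the workflow class.
Dictionary<string, object> inputs = new Dictionary<string, object>();
inputs["OrderId"] = 42;

WorkflowInstance instance = runtime.CreateWorkflow(typeof(OrderWorkflow), inputs);
instance.Start();
```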
Now on to Scott:
Scott doesn’t really have complaints (OK, a few about the WF designer – I’m leaving
those alone for brevity) – what he has is a list of “gotchas”. Now – I would
argue in return that every runtime (Java, .NET, ASP.NET, WCF, .NET Remoting, BizTalk)
has “gotchas” – which generally relate to understanding the model of that particular
runtime or library. That being said – here are my responses to his points:
1) Spawned execution contexts. These are really super important in terms of
the model. What part of the model? Compensation for one. If
each child inside of a While Activity didn’t have a persistable context – then it
would be impossible to come back to that activity at some time in the future (could
be years – that is what the model is meant to support) and tell that activity to compensate,
since that activity would have no state to remember what to compensate. Also
in persistence it is vital to remember all the activities that have executed.
So – yes in general you need to be really careful that your Activities are all Serializable.
This is true in other runtimes as well (like when you store objects in out-of-proc
Session in ASP.NET or work in .NET remoting – so it really isn’t anything new for
most .NET developers).
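As a small illustration of that serialization concern (the activity and its fields are hypothetical):

```csharp
using System;
using System.IO;
using System.Workflow.ComponentModel;

[Serializable]
public class LookupActivity : Activity
{
    // Plain state like this survives dehydration with the instance.
    private string lastResult;

    // Live handles can't be persisted; mark them NonSerialized and
    // recreate them after the instance is rehydrated.
    [NonSerialized]
    private StreamWriter log;
}
```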
2) See #1
3) So this is something I’ve argued about on the WF forums – how is this different
from any random .NET code that uses System.Transactions? It isn’t. If
you use more than one connection to SQL Server 2005 you get a DTC transaction.
End of story – it happens in C#, VB.NET, ASP.NET, WCF and WF. So the issue here
is that if you’re using the *out-of-box* Tracking and Persistence services – *and* connecting
to a database – you get DTC transactions. Just like if you used three objects
in a .NET library and all of them used different connection objects. The OOB
Tracking and Persistence services are supposed to be reference implementations to get your WF application
started – and if your application works with them, super: use the OOB implementations.
How to get rid of the DTC? Build your own Tracking and Persistence service that
uses a common connection (like they do if you configure it) with your own code, and
you now get local SQL Server transactions – magic if you understand the model.
So what’s the real upshot here? Are Scott and Brian right or am I right?
I think I’m right of course 😉 And here’s why – I think I understand the WF Model. Coming
from BizTalk has made me understand the power of this model (since BizTalk orchestrations
have the ability to do many of the same things WF workflows can do). Design-time
and runtime visibility, the ability to model many different kinds of short and long-running
processes (I can go on and on about the features) – are IMO really powerful ways
to model real world processes.
You have to remember the charter of the WF Team – they aren’t just building a
visual way to write random .NET code – they are creating a way to write applications
that are workflow enabled, that need all or some of the potential services that the
WF model provides. The workflow runtime is based on a certain set of assumptions
about how applications should be put together (although almost all of those
assumptions are pluggable pieces of the infrastructure that you can change if you
like).
Perhaps your application won’t do well with the model that WF provides.
But I think with more and more people writing services – there is going to be a big
need to tie those services together (not to mention all the applications that people
write today which really are workflows whether people realize it or not). And
I think WF is going to be proven to be the best way to write those kinds of applications.
Is WF easy? No – I do not think WF is easy. If it were easy it would hardly
be powerful enough to be very useful.
Flame away 🙂

by community-syndication | Aug 21, 2006 | BizTalk Community Blogs via Syndication
Remember, when you apply maps in a receive port; the maps are applied after the pipeline process. On the way out, the direction is opposite.
by community-syndication | Aug 21, 2006 | BizTalk Community Blogs via Syndication
So I am currently architecting and developing a BizTalk 2006 SAP integration solution for a customer. This is really great stuff. One of the key designs of this solution was using SSO to store SAP configuration data along with other custom data. In my opinion, if you are thinking about using configuration data inside of BizTalk you must take a serious look at SSO as a data store. One of the gems out there to get someone up to speed very quickly is two samples from MS; get them here. The ones you want to look at are the ‘Enterprise Library 2.0 with BizTalk Server’ and ‘SSO as Configuration Store’. The former includes the design and runtime assemblies that extend the configuration provider, allowing you to use SSO as a configuration store while using Enterprise Library (or someone else’s product), which I feel solves the problem of why developers never used SSO in the first place. I’ve been to many customers that are currently using BizTalk and, maybe because they didn’t understand the ins and outs of SSO, ignored it completely. On a little side note, if you want a good place to start learning about SSO, go to Richard Seroter’s blog about SSO here. He’s a great talent in the BizTalk space. Now where was I? Oh yes – the second sample is a great introduction to actually using the tools provided out of the box to enable you to create an ‘Affiliate Application’. What is an ‘Affiliate Application’, you ask? Basically, AAs are logical containers that represent a system you want to use SSO with (SAP in my case). Now AAs can be grouped into several types (of course), and those types are Individual, Group and Host Group.
For this article I’ll be focusing on the Individual type, which offers a 1:1 mapping between a Windows account and a non-Windows account (e.g. an SAP account). Since SAP uses basic authentication credentials (username and password), the mapping is quite easy.
Now, in order to enable SSO you can use the command line utility SSOManager.exe -enablesso, as SSO is not enabled by default. The easiest way to send data into SSO is by using an XML document with SSOManager.exe -createapps, which will traverse the XML document and create an AA and its key-value pairs. Here’s a sample XML file to send in:
<?xml version="1.0"?>
<sso>
<application name="AppMgmtUsingSSOApp">
<description>AppMgmtUsingSSOApp</description>
<appUserAccount>BizTalk Application Users</appUserAccount>
<appAdminAccount>BizTalk Server Administrators</appAdminAccount>
<field ordinal="0" label="reserved" masked="no" />
<field ordinal="1" label="connectionString" masked="yes" />
<flags configStoreApp="yes" allowLocalAccounts="yes" enableApp="yes" />
</application>
</sso>
Notice the 0 ordinal is reserved; it must be set this way or an exception is thrown.
Now MS always gives you at least ten different ways to do the same thing. So, as another way to enter your data into the SSO data store, you can use the SSO Client Utility that comes with the Enterprise SSO bits. If you look in the BizTalk help files and search for ‘How to install the SSO Client Utility’ you’ll find the steps involved. Developers can also programmatically read and write data to the SSO store using the SSOConfigStore object. Here’s a sample for reading configuration from the store:
public static string Read(string appName, string propName)
{
    try
    {
        SSOConfigStore ssoStore = new SSOConfigStore();
        ConfigurationPropertyBag appMgmtBag = new ConfigurationPropertyBag();
        ((ISSOConfigStore)ssoStore).GetConfigInfo(appName, idenifierGUID, SSOFlag.SSO_FLAG_RUNTIME, (IPropertyBag)appMgmtBag);
        object propertyValue = null;
        appMgmtBag.Read(propName, out propertyValue, 0);
        return (string)propertyValue;
    }
    catch
    {
        throw;
    }
}
Notice the use of the ConfigurationPropertyBag object, which is just a class that implements IPropertyBag.
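For completeness, here is a hedged sketch of the write-side counterpart, assuming the same ConfigurationPropertyBag helper and the same identifier variable (idenifierGUID) used in the Read sample:

```csharp
public static void Write(string appName, string propName, string propValue)
{
    SSOConfigStore ssoStore = new SSOConfigStore();
    ConfigurationPropertyBag appMgmtBag = new ConfigurationPropertyBag();

    // IPropertyBag.Write takes the value by reference.
    object value = propValue;
    appMgmtBag.Write(propName, ref value);

    ((ISSOConfigStore)ssoStore).SetConfigInfo(appName, idenifierGUID, (IPropertyBag)appMgmtBag);
}
```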
Now, in terms of using this data store to hold SAP data mappings, first we need to find out what SAP needs, in addition to just username and password, to gain access to the system. Here’s the list of keys we need to set in our key-value pairs:
- SAPInlineHostName
- SAPInlineClientNumber
- SAPInlineSystemNumber
- SAPInlineUserName
- SAPInlinePassword
So once you set up these key-value pairs, you can configure your adapter by selecting the AA from a dropdown of available AAs in the configuration window, thus not having to fill out username and password on that screen. To the adapter, the AA is mutually exclusive with a UserName and Password pair. Internally (after looking at the adapter code), the adapters make a call to SSO with a ValidateAndRedeemTicket method, passing in the message itself and the AA information, and getting the credentials back in the form of a string[] object.
I guess my point in this article is just to bring awareness to SSO, because as I look out onto the BizTalk landscape I’m not seeing customers using it (1) to its full potential or (2) at all, and I find SSO a value-add to most solutions. So please get out there, download the samples and enjoy. I’m sure to be posting more about SSO in the near future, so stay tuned.
by community-syndication | Aug 20, 2006 | BizTalk Community Blogs via Syndication
When using XML as your fact, if the namespace changes, make sure you update your rules. The easiest way may be to export the rules (via BizTalk Administration Console > Applications > Assembly > Policies) and do a search and replace on the namespace.
by community-syndication | Aug 19, 2006 | BizTalk Community Blogs via Syndication
To expand on this entry: while testing, there are many times when I have to use mllpreceive.exe to capture data.
I have made it easy by adding right click functionality.
I created a batch file called Receive.bat with the following contents:
MLLPRECEIVE.EXE /P 12000 /SPLIT /SB 11 /EB 28 /CR 13 /D %1
I then […]
by community-syndication | Aug 18, 2006 | BizTalk Community Blogs via Syndication
Anthony Pounder asked me if you can do conditional compilation in BizTalk orchestrations. The answer is yes, but you have to edit the XLANG/s in the .odx file directly. This is not functionality that is surfaced in the orchestration designer.
I wrote an article on XLANG/s a long time ago when I was still fairly new to BizTalk, and at the time I thought it would probably be a useful technique to be able to edit XLANG/s textually. It turns out that almost all the XLANG/s functionality is surfaced in the orchestration designer. This is one feature which isn’t. BizTalk orchestration currently provides two DSLs (Domain Specific Languages) layered on top of each other. One is the graphical orchestration designer which creates ODX (Orchestration Designer XML), and the other is the XLANG/s language. A long time ago (2001) Microsoft gave some consideration to making XLANG/s into a ‘first-class’ .NET language. However, this never happened. Instead, the worth of XLANG/s has been largely removed by the graphical orchestration designer (which did not exist in 2001). There is almost no ‘abstraction space’ between the two DSLs – i.e., they are almost identical in what they do and the abstractions they offer. By definition, the worth of a DSL is in terms of the abstraction space between it and the next layer down (generated C# in this case). So, for BizTalk, one of the two DSLs is almost wholly redundant. In this case, the obvious candidate for redundancy is XLANG/s, and sure enough, we do not expect it to survive into the planned 2008 ‘nextgen’ version of BizTalk which will use Windows Workflow Foundation.
In the meantime, there are a couple of features which XLANG/s uniquely offers, and conditional compilation is one of them. To use it, you need to edit the .odx file directly in a text editor. The XLANG/s module contained in this file uses conditional compilation to include the ODX XML in the .odx file. Note how inconsistent the use of the .odx extension is. This is really an XLANG/s file that contains ODX, and not the other way around! You can use #define to define a conditional constant, and then use #if just like C# to conditionally compile sections of the XLANG/s. However, once you have created your constant, always enter #if and #endif in the expression editor in BizTalk, rather than directly in your .odx file. The code you see in the expression editor is sourced from the ODX XML created by the designer and expression editor, rather than directly from the XLANG/s, so if you amend the XLANG/s directly you can’t see your changes in the designer, unless you find the corresponding XML content and amend that to be identical. This is one of the main reasons why I cannot recommend editing XLANG/s directly.
If you are determined to do this, I would strongly recommend that you don’t try to use conditional compilation except in expression and message assignment code. I have experimented a bit, and this does work, but there is no way to reflect it in the ODX, and therefore you lose sight of it entirely in the orchestration designer. Apart from anything else, this would be a reckless thing to do in terms of code maintenance.
by community-syndication | Aug 17, 2006 | BizTalk Community Blogs via Syndication
Hello Everyone,
My name is Sreedhar Pelluru and I am a Programmer Writer with the BizTalk Server team here at Microsoft. I own the content for the Business Rule Engine (BRE) at the time of this posting. The reason for creating this blog is to evangelize BRE. You will also see postings from our BRE development team on this blog periodically. I am planning to blog on basic and advanced features of BRE, and walkthroughs that use BRE features. Please feel free to send me any feedback on BRE content in the BizTalk documentation or any posting on this site.
Thanks & Regards,
Sreedhar Pelluru
MCSD .NET, MCP (BizTalk 2006)
by community-syndication | Aug 17, 2006 | BizTalk Community Blogs via Syndication
The first in a series of papers addressing developing code for BAM solutions is now live at
http://download.microsoft.com/download/b/1/d/b1d9ddf9-88c6-4d4e-abea-4787fdc85bec/DevelopingWithBAM101.exe
This paper covers the basics of using EventStreams to instrument your existing business processes.
by community-syndication | Aug 17, 2006 | BizTalk Community Blogs via Syndication
Now that the Commerce Server development team has shipped their flagship product, they’ve started to publish some great new posts about using Commerce Server 2007. If you don’t already subscribe to these blogs, you really should!
Nihit Kaul [MSFT] – SDET (Super Dude Especially Testing) has posted a bunch of great tips for working with Commerce Server 2007. Nihit is a great guy and accomplished developer/tester who frequents the microsoft.public.commerceserver newsgroups, helping out folks with their CS development issues.
CS 2007: Things you didn’t know about the Customer and Orders Manager UI
CS 2007: Running Pipelines in a Console Application
CS 2007: How to display a custom user profile definition in the Customer and Orders Manager?
CS 2007: Where is my CreditCardNumber?
Alan Faulkner [MSFT] – SDET (Another Super Dude and Great Proof Reader) has also been busy with some really good posts explaining the inner workings of Commerce Server 2007. If you liked my recent TechNet article about using the new Orders Adapter, thank Alan. He was kind enough to proof read it during the early stages!
Error Connecting to Sql Server when running Data Warehouse Import Wizard
How to generate trace output for the Commerce Server Adapters
Commerce Server Adapter Message Schemas Shipped in SDK
Why isn’t my Commerce Server Events recorded in Data Warehouse Analytic Reports?
Order Adapter’s Status Values
Vinayak Tadas [MSFT] – CSCG (Commerce Server Catalog Genius) has some really in-depth posts about the Catalog subsystem that anyone using CS2007 should read again and again.
Securing the catalog system
Improving the catalog search experience in Commerce Server 2007
Implementing Thesaurus support in the catalog system
New Catalogset features in Commerce Server 2007
Caching in the catalog system
Vinod Kumar [MSFT] – TM (The Man!) has also begun publishing some excellent posts, especially if you plan to extend the new Orders system in Commerce Server 2007.
Orders DataMigration From CS2002/2000 to CS2007
Extensibility Notes
Changing Column Matching for OrderTemplate and Basket Persistence
Mapping Weakly Typed Properties to Storage
Extending the Orders System
Technorati Tags: Commerce Server
by community-syndication | Aug 17, 2006 | BizTalk Community Blogs via Syndication
We just released two great resources for BizTalk BAM aficionados. The first,
BAM Frequently Asked Questions covers great topics like BAM tracing, common errors, maintaining the BAM databases and more. Good stuff.
I really like the second doc.
Developing with BAM goes through the BAM API and explains (with code snippets) how to use the API. Obviously you could write
volumes on the tactics for BAM programming, but this document is a great start for those looking to get more out of BAM.
Technorati Tags: BizTalk