Using data driven workflow activities and repeated correlation

How’s that for a title? The driver for this post is that I’m training a lot of folks right now on using WF to build workflows, and when we cover correlation, the example we use has two branches in a parallel activity where we correlate within each branch. The question inevitably comes up about how to do this dynamically, where the number of “branches” is driven by data. So, I whipped up this example that uses code from the Microsoft Windows Workflow Foundation Labs (Beta 2) to showcase the Replicator Activity and the Conditioned Activity Group (CAG). These two activities provide for richer models of business rule or data driven replication using correlation.
In order to enable my example, I have created a composite activity that wraps up the voting behavior found in the communications lab. This gives me a reusable component that I can use in multiple places in my workflow, and it also gives me the ability to set properties on the set of activities (now my single composite activity) for each instance. For example, one of the key properties I can set is the alias, or name, of the user who should vote. In addition, I can define events at the composite activity that my workflow can listen for and handle. This allows my workflow to take action when a voter approves or rejects a ballot. Finally, I define a correlation token within the activity and set the parent activity to the composite activity itself. This provides the context in which my correlation token is valid and is the key to allowing me to correlate within each activity. Note that it is not required that you do this in a composite activity; the parent activity for your correlation token can be a sequence or some other container. Lastly, I added a property to my workflow called Aliases, of type List<string>, which holds the data for my workflow. I pass this data in from program.cs via parameters to the workflow.
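Passing that data in from program.cs is just the standard WF named-parameter dictionary, where the key must match the public property name on the workflow. A minimal sketch (the workflow type name and alias values here are made up for illustration):

```csharp
// Sketch of program.cs passing data into the workflow via parameters.
// "VotingWorkflow" and the alias values are hypothetical; the dictionary
// key ("Aliases") must match the public property on the workflow.
using System;
using System.Collections.Generic;
using System.Workflow.Runtime;

class Program
{
    static void Main()
    {
        using (WorkflowRuntime runtime = new WorkflowRuntime())
        {
            List<string> aliases = new List<string>();
            aliases.Add("alice");
            aliases.Add("bob");
            aliases.Add("charlie");

            Dictionary<string, object> parameters = new Dictionary<string, object>();
            parameters.Add("Aliases", aliases);

            WorkflowInstance instance =
                runtime.CreateWorkflow(typeof(VotingWorkflow), parameters);
            instance.Start();
        }
    }
}
```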
In my workflow, I have two main activities: the first is a replicator, which shows how to use that activity to run data driven, correlated activities; the second is the Conditioned Activity Group, which lets you use rules/conditions to determine which branches of activities to run and when to stop running them.
The replicator activity in my example has its InitialData property bound to the workflow property for aliases, so I am going to run whatever activities are in the replicator as many times as I have items of data. I also need to pass that data to each instance of the voting activity before it is executed (specifically, the alias it should use). I use the ChildInitialized event on the replicator to set the alias based on the current data item; the event hands me the current item in my collection of data, and I can then use that to configure the currently executing activity or set of activities. I also set the ExecutionType property on my replicator to indicate whether I want all the branches to run in parallel or in sequence. Very powerful option!
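The handler itself is only a couple of lines. A sketch, assuming the composite activity type is called VotingActivity with an Alias property (those names come from this example, not the SDK):

```csharp
// Sketch of the replicator's child-initialized handler: copy the current
// data item (an alias string) onto the child activity before it executes.
// VotingActivity and Alias are this example's names, not framework types.
private void voteReplicator_ChildInitialized(object sender, ReplicatorChildEventArgs e)
{
    VotingActivity vote = (VotingActivity)e.Activity;  // the child instance about to run
    vote.Alias = (string)e.InstanceData;               // the current item from InitialData
}
```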
For the CAG, I have included two voting activities and bound their aliases to two different instances of input data on the workflow, using indexed values from the List<string> property. When dealing with a CAG, you have to set the “When” property on each branch of execution, which indicates the rule or code that determines whether that particular branch should execute. On the CAG itself, you provide an “Until” property that likewise specifies code or rules, but this provides the condition under which execution should stop. So the CAG will continue running (foreach/while style looping) until the “Until” condition is met. On each iteration, it will use the “When” condition on each activity branch to determine whether it should run that branch. So you get replication, but with rules to guide it. You get a lot more than that, but this is a simple example.
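If you use code conditions rather than declarative rules, the “When” and “Until” handlers are just methods that set e.Result. A sketch, assuming a boolean workflow field tracking whether anyone has approved yet (the field name is made up):

```csharp
// Sketch of CAG code conditions. "approved" is a hypothetical workflow
// field set from the voting activity's approve event handler.
private void UntilCondition(object sender, ConditionalEventArgs e)
{
    // The CAG stops looping once this evaluates to true.
    e.Result = this.approved;
}

private void WhenVoterBranch(object sender, ConditionalEventArgs e)
{
    // Run this branch on each iteration while no yes vote has arrived.
    e.Result = !this.approved;
}
```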
When you run the program, it might help to disable one activity and focus on the output of the other. Running the replicator will give you three vote dialogs, each with the appropriate user’s name, and the console will log their votes accordingly. We correlate the response back into the workflow on each instance of the voting activity. When you run the CAG, you’ll see that the first time through, both voting activities run. Until you vote “yes” on one of the items, the CAG will continue to run and use the “When” condition to determine which activity branches to run. Play around with your voting and notice what happens if you continue to vote no with one person for a few rounds, then vote yes.
The final thing to notice is that, with our CAG, we need to catch the situation where someone has voted yes but the other person has not yet responded. If we don’t handle this scenario, when the second person votes they will get an exception. We use the WorkflowQueueInfo data from the workflow instance to query the workflow and see if it is waiting for any responses. If it is not, we exit gracefully instead of trying to raise the event; if it is, we go ahead and raise the event. This is one way you can attempt to avoid the EventDeliveryFailedException.
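The host-side guard is worth sketching, using WorkflowInstance.GetWorkflowQueueData. This is a rough illustration of the idea, not the exact code from the sample:

```csharp
// Sketch: query the instance's queues before raising the event, and only
// deliver it if something in the workflow is still subscribed -- otherwise
// exit gracefully instead of triggering EventDeliveryFailedException.
using System.Collections.ObjectModel;
using System.Workflow.Runtime;

static bool IsWaitingForResponses(WorkflowInstance instance)
{
    ReadOnlyCollection<WorkflowQueueInfo> queues = instance.GetWorkflowQueueData();
    foreach (WorkflowQueueInfo queue in queues)
    {
        if (queue.SubscribedActivityNames.Count > 0)
            return true;  // an activity is still blocked waiting on this queue
    }
    return false;
}
```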
Hopefully these examples will prove useful for someone. I plan to do my best at creating more examples to share. As always, feedback is welcomed. You can get the code here.

One Solution For Handling Base Addresses in the Jan CTP of WCF

One of the changes that occurred on the way to the Jan CTP of WCF is that you must supply a base address to the ServiceHost class when constructing it in order to get a metadata access point. You now use the base address to access the metadata documentation page; you can’t just use an absolute address in the configuration file and get metadata. Note that you can still use an absolute address in the configuration file, you just don’t automatically get the documentation/metadata functionality unless you use a base address as well.
The problem with this is that all of the examples from Microsoft using this new requirement show reading these base addresses from appSettings in configuration files. This is for good reason: if anything is going to change in your addressing as you move from one environment to the next, it is probably the base address, where you include server names, so it makes sense to store this information in the configuration file so that it can be easily updated without recompiling the service code. Being a lazy programmer, I don’t want to write the same configuration code over and over, so I created a service host base class that does it for me.
My example, which can be found here, defines a ServiceHost-derived class that you can use to host your services, along with the appropriate configuration code to allow for a custom configuration section containing the base addresses for your service. The service host base class overrides the OnApplyConfiguration method and adds the addresses from the configuration file into the collection of base addresses for you.
Feel free to try it out and provide any feedback. I don’t know what the final plans are from MS on this, but hopefully this will make someone’s life a bit easier, at least in the short term.
Update: some people have had trouble opening this solution. The solution was created in VS 2005 Team Edition for Software Developers. It contains the main project as well as a test project and some solution items. If you experience trouble, simply open Visual Studio 2005 and open the single project (PS.WCF.Configuration) which is a simple c# project.
Update: I’ve updated this sample to work with the Feb CTP. I don’t like this solution as much because it is not as clean: the BaseAddresses collection is now read-only, so I can’t update it in the ApplyConfiguration method. Instead, I am creating the collection in my constructor, which I really don’t like, but it does work. I’m hoping this gets easier in later releases, not harder.
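As a rough sketch of the Feb CTP shape of this idea (my real sample uses a custom configuration section; here I read a semicolon-delimited appSettings key instead, purely as a simplifying assumption):

```csharp
// Sketch only: a ServiceHost-derived class that builds its base address list
// in the constructor, since BaseAddresses is read-only in the Feb CTP.
// The "baseAddresses" appSettings key is a stand-in for my custom section.
using System;
using System.Collections.Generic;
using System.Configuration;
using System.ServiceModel;

public class ConfiguredServiceHost : ServiceHost
{
    public ConfiguredServiceHost(Type serviceType)
        : base(serviceType, LoadBaseAddresses()) { }

    private static Uri[] LoadBaseAddresses()
    {
        string raw = ConfigurationManager.AppSettings["baseAddresses"];
        if (string.IsNullOrEmpty(raw))
            return new Uri[0];

        List<Uri> addresses = new List<Uri>();
        foreach (string address in raw.Split(';'))
            addresses.Add(new Uri(address.Trim()));
        return addresses.ToArray();
    }
}
```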

External File Dependency in NUnit

Of late I have been doing a lot of work on NUnit, a popular open source unit testing framework. The area in which I have spent the most time is extending the available Assert models to accomplish new things. Today I finished, and sent in to be merged, a FileAsserter which will compare, byte by byte, any file with any other file. The work is based on some work started by Darrel Norton, which I extended and merged with the NUnit way of coding asserters. In the process of writing the unit tests to support this functionality, I found myself needing external files that could move gracefully along with the test library. Easier said than done.

Fortunately for me, I remember reading about a way to handle this some time ago on Scott Hanselman’s blog. He was in fact just quoting Patrick Cauldwell. Both variations of handling this were good, but not perfectly portable. I realized that this could be encapsulated into a reusable component which implemented the IDisposable interface. This would allow me to use the using() statement and ensure that files were always cleaned up rather than accidentally forgotten when you called only half of the routines presented by Scott or Patrick. So without further ado, here is my TestFile class.

  using System;
  using System.IO;
  using System.Reflection;

  public class TestFile : IDisposable
  {
    private bool _disposedValue = false;
    private string _resourceName;
    private string _fileName;

    public TestFile(string fileName, string resourceName)
    {
      _resourceName = resourceName;
      _fileName = fileName;

      Assembly a = Assembly.GetExecutingAssembly();
      using (Stream s = a.GetManifestResourceStream(_resourceName))
      {
        if (s == null)
          throw new Exception("Manifest Resource Stream " + _resourceName + " was not found.");

        using (StreamReader sr = new StreamReader(s))
        {
          using (StreamWriter sw = File.CreateText(_fileName))
          {
            sw.Write(sr.ReadToEnd());
            sw.Flush();
          }
        }
      }
    }

    protected virtual void Dispose(bool disposing)
    {
      if (!this._disposedValue)
      {
        if (disposing)
        {
          if (File.Exists(_fileName))
          {
            File.Delete(_fileName);
          }
        }
      }
      this._disposedValue = true;
    }

    #region IDisposable Members

    public void Dispose()
    {
      // Do not change this code. Put cleanup code in Dispose(bool disposing) above.
      Dispose(true);
      GC.SuppressFinalize(this);
    }

    #endregion
  }
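Usage then looks like any other using() block. A hypothetical NUnit test (the resource names and the FileAsserter-backed assertion call are illustrative, not taken from the merged code):

```csharp
// Hypothetical usage: both files are extracted from embedded resources and
// deleted automatically when the using blocks unwind, whether the assertion
// passes or throws.
[Test]
public void FilesAreByteForByteEqual()
{
    using (new TestFile("Expected.dat", "MyTests.Resources.Expected.dat"))
    using (new TestFile("Actual.dat", "MyTests.Resources.Actual.dat"))
    {
        FileAssert.AreEqual("Expected.dat", "Actual.dat");
    }
}
```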

BizTalk 2004: Orchestration: Relationship of Receive port shape and Receive shape, Send port shape and Send shape


I don’t know whether this will be useful to somebody…
I’m constantly trying to understand the rules for the different BizTalk tools.
For example, the BizTalk Orchestration Designer:


Can I receive one message with the same type from different Receive Ports into one Receive shape?

No. (Pict. 1)

Can I receive the same message with the same type into several Receive shapes?

Yes. (Pict. 2)

Can I send the same message to different Send Ports?

No. (Pict. 3)

Can I send different messages with the same type to several Send Ports?

Yes. (Pict. 4)

Why are there these limits?

Quite obvious (??): inside the Message Box, different messages with the same type are transferred in the same manner. That’s why, for messages with the same type, only one receive and one send endpoint is permitted.

Rule: “Links for the messages with the same type are always gathered in one receive/send point on the port panel.”

Try to link different Ports with the same message type to different Receive shapes (Pict. 5):

We’ve got error #1:

…odx…: error X2214: you must specify at least one already-initialized correlation set for a non-activation receive that is on a non-selfcorrelating port
    : for example, mark the receive Activate property as True
    : or, mark the port Binding property as Direct and the Partner Orchestration Port as Self Correlating
    : or, check a correlation on the receive Following Correlation Sets property

Good.

If we try to mark (as recommended) the second Receive shape’s Activate property as True, we get a different error #2:

…odx…: error X2071: an activatable receive must be the first executable statement in a service

OK. The second error message does not “correlate” with the first one. Sure, error message #1 is not about my stupid changes 🙂

If I try to mark the first Receive shape with “Initializing Correlation Sets” and the second one with “Following Correlation Sets” (it’s not so stupid), I get error #3:

…odx…: error X2259: in a sequential convoy receive the ports must be identical
    …Accounting.odx(353,22): could be ‘R_Schema1’
    …Accounting.odx(357,13): or ‘R_Schema1_2’

Excellent! That means “my” rule works!

If you have comments, please, give me a feedback!
Regards,



Leonid Ganeline
BizTalk Developer

Checklist before performing upgrade from BizTalk Server 2004.

I thought I would add a checklist that you need to run through before upgrading from BizTalk 2004. Though you can find all of this in the upgrade guide, I wanted to emphasize a few things. If you follow the upgrade guide religiously, the upgrade experience should be seamless.


 


  • The user performing the upgrade must have the following rights:
    • Administrator on the local machine
    • SQL Server System Administrator rights on the SQL Server
    • BizTalk Server Administrator
    • SSO Administrator
  • The Network Service account must have write access to %windir%\temp.
  • The Enterprise Single Sign-On Master Secret Server must be running at the time of upgrade, as must the SQL Server that hosts the SSO database, regardless of whether it is running locally or on a remote server.
  • The IIS 6.0 Common Files component must be installed if you would like to publish BizTalk Web services in BizTalk Server 2006, for the following reasons:
    • Runtime machines providing the ASP.NET Web services require them.
    • Administration machines used to package BizTalk applications require them to access the IIS metabase on remote machines if the application contains or references remote web directories.
  • If you configure all BizTalk Server 2006 databases to be BizTalkMgmtDb except BizTalkEDIDb and SSODB, you will not be able to upgrade to a later version. Build-to-build upgrade on consolidated databases is not supported.
  • The SQL Server 2000 Analysis Services client tools must be installed on the computer in order to perform the upgrade on Analysis Services databases.
  • In SQL Server 2000 Analysis Services, a dimension level can reference an alias called “Error” or “Name”. In SQL Server 2005 Analysis Services, however, these are reserved words. The upgrade may fail if custom BAM views on OLAP cubes are deployed. To work around this, modify the Alias name in Excel, or the Name attribute of that Alias element in the XML, before the upgrade.
  • If you have the BizTalk Server 2004 versions of the MQSeries Adapter and MSMQ Adapter installed on the machine, export the bindings and then uninstall them before the upgrade.
  • If you are upgrading Business Activity Services, it is recommended that you synchronize the BAS site with the TPM database before you upgrade. For more information about running the Sync command, see “How to Synchronize BAS Data” in BizTalk Server 2004 Help.
  • A BAM Data Transformation Services (DTS) package must not be running when the upgrade is performed; otherwise there may be loss of data or corruption of a cube.

Migration Paths for BizTalk Server

Here are the supported migration paths to BizTalk Server 2006. You can find this information in the installation guide as well.

Upgrade from BizTalk Server 2000 and BizTalk Server 2002


Upgrading from BizTalk Server 2000 or BizTalk Server 2002 to BizTalk Server 2006 is not automated. Instead, you can migrate from and interoperate with BizTalk Server 2002. To migrate from BizTalk Server 2002 to BizTalk Server 2006, launch a new BizTalk Server Migration Project from Visual Studio 2005 and follow the wizard to begin the migration. For more information, see http://go.microsoft.com/fwlink/?LinkId=61910.


Note   BizTalk Server 2000 will be leaving mainstream lifecycle support on June 30, 2006.


Upgrade from BizTalk Server 2004


BizTalk Server 2006 is designed to allow an easy upgrade from BizTalk Server 2004 with Service Pack 1. The process is automated and wizard based. The following table indicates the supported upgrade scenarios by SKU.


BizTalk Server 2004 Edition | 2006 Developer Edition | 2006 Standard Edition | 2006 Enterprise Edition
Developer                   | Yes                    | No                    | No
Partner                     | No                     | Yes                   | Yes
Standard                    | No                     | Yes                   | Yes
Enterprise                  | No                     | No                    | Yes

BizTalk 2004 – New Transaction Cannot Enlist in the Specified Transaction

We recently encountered an issue on one of our BizTalk 2004 Groups that took quite a long time to figure out, so I thought I would post about this for self future reference and also in case anyone else ever stumbles across the same problem.


The group configuration consisted of two BizTalk servers, two SQL Servers clustered in an active/passive mode, one of these containing the Message Box the other hosting the other BizTalk databases. We also had a separate Analysis Services database.


Once everything had been installed and BizTalk was configured we noticed we were getting intermittent errors simply when browsing the adapters within the BizTalk Administration Console. These were related to transactions, as shown in the following capture:



Numerous checks were made to ascertain that MSDTC was functioning correctly and had the correct security settings. Generally, if you experience any errors that mention transactions in the description, your first port of call should be to make sure that the MSDTC security settings are not restricting you. However, in our case this was all working fine, and running tests using DTCPing, DTCTester and the like did not reveal anything either.


The other interesting (or should I say frustrating) aspect of this error was its intermittent nature, which made it particularly difficult to debug. However, once we installed an orchestration on the server the error became more prevalent. We could now replicate a similar error on demand simply by attempting to stop an orchestration within BizTalk Administrator, which produced the error “A connection with the transaction manager was lost”.


After various investigations and discussions with PSS, a short-term fix was found: taking the Management Database cluster resource offline, bringing it back online, and restarting SQL Server would solve the problem.


However, the actual issue is addressed in Windows 2003 SP1 (which was not installed on these servers), so if you have that applied you are in luck. If you cannot apply Windows 2003 SP1, you must instead install the following hotfix, which addresses the issue:


http://support.microsoft.com/default.aspx?scid=kb;en-us;895250


Stored Procedure to keep BizTalk backup directory clean

Since the Backup History Delete step does not clean up the backup files, I have made a stored procedure that also cleans up the files. Add the stored procedure below to the BizTalk Management Database, and change step 3 of the backup job by appending “AndFiles” to the end of the existing stored procedure call.






CREATE PROCEDURE [dbo].[sp_DeleteBackupHistoryAndFiles] @DaysToKeep smallint = null
AS
BEGIN
  SET NOCOUNT ON

  IF @DaysToKeep IS NULL OR @DaysToKeep <= 0
    RETURN

  /*
    Only delete full sets.
    If a set spans a day such that some items fall into the deleted group
    and others don't, don't delete the set.
  */

  DECLARE DeleteBackupFiles CURSOR
  FOR SELECT 'del ' + [BackupFileLocation] + '\' + [BackupFileName]
      FROM [dbo].[adm_BackupHistory] [h1]
      WHERE datediff( dd, [BackupDateTime], getdate() ) >= @DaysToKeep
        AND [BackupSetId] NOT IN ( SELECT [BackupSetId]
                                   FROM [dbo].[adm_BackupHistory] [h2]
                                   WHERE [h2].[BackupSetId] = [h1].[BackupSetId]
                                     AND datediff( dd, [h2].[BackupDateTime], getdate() ) < @DaysToKeep )

  DECLARE @cmd varchar(400)

  OPEN DeleteBackupFiles
  FETCH NEXT FROM DeleteBackupFiles INTO @cmd
  WHILE (@@fetch_status <> -1)
  BEGIN
    IF (@@fetch_status <> -2)
    BEGIN
      EXEC master.dbo.xp_cmdshell @cmd, NO_OUTPUT
      DELETE FROM [adm_BackupHistory] WHERE CURRENT OF DeleteBackupFiles
      PRINT @cmd
    END
    FETCH NEXT FROM DeleteBackupFiles INTO @cmd
  END

  CLOSE DeleteBackupFiles
  DEALLOCATE DeleteBackupFiles
END
GO
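For reference, the changed job step is just the stock call with “AndFiles” appended; something like the following (the 14-day retention value is only an example, use whatever retention period fits your environment):

```sql
-- Step 3 ("Clear Backup History") of the Backup BizTalk Server job, with
-- "AndFiles" appended to the original sp_DeleteBackupHistory call.
-- @DaysToKeep = 14 is just an example retention period.
exec [dbo].[sp_DeleteBackupHistoryAndFiles] @DaysToKeep = 14
```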





Virtual PC vs VMWare

Do you use virtual images?  I use them almost daily, and when you work with something like BizTalk Server…it’s a must!  I’ve always been a fan of Virtual PC and really did not have any issues with it…until recently.  Jonathan Summers and I were preparing for the BizTalk First Look Clinics and we were frustrated by the performance of Virtual PC.  Jonathan has struggled with Virtual PC on his Dell laptop, and there are some other grumblings out there on the Internet.


Personally, I didn’t have too many issues until recently.  At the launch in Houston, my keyboard gave out in my VPC during my presentation.  My virtual machines started to lag, and then all of a sudden I started getting a 2-3 second pause about every 10 seconds when using VPC…very frustrating.  I said no worries and just installed Virtual Server R2.  Not too many issues at first, but then my Virtual Machine Remote Client started acting up.  I started getting double keystrokes when typing, and performance was still just OK.  I ended up just remoting into the virtual machines, but this wasn’t the greatest method either.


So after all of this, it was time to give VMWare a try.  I used VMWare a long time ago and it wasn’t a pleasant experience.  I downloaded Workstation 5.5, installed it, and everything was fine.  I also downloaded a Virtual Machine Importer tool so I could reuse my existing VPC machines.  I ran the importer and pulled in one of my BizTalk 2006 Beta 2 machines.  Holy crap…the VMWare machine was just as fast as my host machine…something I never experienced with VPC.  I think I’ve been sold on VMWare, and here are a few of the benefits:



  • Faster performance – I’ve been able to load Windows 2003 and XP with no issues

  • Display support is better

  • USB support is included

  • Faster pause or hibernation – you can pause a machine with the click of a button and it’s paused

  • Faster start up when paused

  • No more loud beeps when you do something wrong – anyone have this issue with VMRC?

  • Better disk allocation – VMWare can be set up to use 2 Gig Files automatically which will lead to less fragmentation

  • Did I mention faster performance?

I still think Virtual PC is a good product, but my demo yesterday was done using VMWare and I think I’ll be using it in the future.  Anyone else have any issues with Virtual PC?


 


It didn’t take very long to convert, but boy was there a difference in performance.  I was also able to load the VMWare Tools, which gave me better display support, and VMWare supports USB as well.