Rules: A response to James McGovern’s questions

I recently came across a post (http://duckdown.blogspot.com/2006/01/outstanding-questions-on-rules-engines.html) from James McGovern dating back to the beginning of the year, in which he asked a number of questions about rules engines.   As rules engines have become a subject of some interest to me over the last eighteen months or so, I thought I would have a go at providing some answers.   In my own inimitable style, I of course wrote far too much to be accepted as a comment on James’ web site (sorry, James)!   So I’ve expanded my replies a bit more and posted them here at http://geekswithblogs.net/cyoung/articles/76795.aspx.   I hope they are of some interest.

Pipeline Testing Library – Part 2

Continuing from part 1, we’ll now see how to handle schemas and execute receive and send pipelines with the pipeline testing library.

Configuring Known Document Specifications

If you want to use an assembler or disassembler on your pipeline, you’ll need to make
sure that it can resolve the necessary schemas based on the fully qualified schema
type, or the root element name (namespace#root). To accomplish this, you’ll need to
make sure that the pipeline is aware of “known” document specifications before you
execute them, so that the mock pipeline context can return the correct value when
IPipelineContext.GetDocumentSpecByName() or IPipelineContext.GetDocumentSpecByType()
is invoked by the assembler/disassembler component.

The library supports this through the AddDocSpec() method on the ReceivePipelineWrapper
and SendPipelineWrapper classes. AddDocSpec() takes as an argument a Type handle referencing
the strongly-typed, SchemaBase-derived class generated by the BizTalk project system
when you compile a schema into a BizTalk assembly. Here’s an example:

ReceivePipelineWrapper pipeline =
   PipelineFactory.CreateEmptyReceivePipeline();
pipeline.AddDocSpec(typeof(Schema2_WPP));

AddDocSpec() will automatically handle multi-root schemas and make all roots in the
schema known to the pipeline context, so you don’t have to worry about that. You can
also add as many document specifications as needed, as long as there are no conflicts
among them (i.e. don’t add two schemas with conflicting namespace#root names).

Executing Pipelines

Now that we can configure the necessary components and schemas into our pipelines,
we are ready to execute them. Both receive and send pipelines are executed through
an Execute() method, but it is defined with a different signature in each of the
ReceivePipelineWrapper and SendPipelineWrapper classes: receive pipelines take
a single IBaseMessage argument and return multiple messages in a MessageCollection
object. Send pipelines, on the other hand, take multiple messages in a MessageCollection
object as input and return a single IBaseMessage object as output.

Here’s a complete example of configuring and executing a receive pipeline:

ReceivePipelineWrapper pipeline =
   PipelineFactory.CreateReceivePipeline(typeof(ReceivePipeline1));

// Create the input message to pass through the pipeline
Stream stream = DocLoader.LoadStream("SampleDocument.xml");
IBaseMessage inputMessage = MessageHelper.CreateFromStream(stream);

// Add the necessary schemas to the pipeline, so that
// disassembling works
pipeline.AddDocSpec(typeof(Schema1_NPP));
pipeline.AddDocSpec(typeof(Schema2_WPP));

// Execute the pipeline, and check the output
MessageCollection outputMessages = pipeline.Execute(inputMessage);

Here’s a complete example of executing a send pipeline, with multiple inputs:

SendPipelineWrapper pipeline =
   PipelineFactory.CreateSendPipeline(typeof(Env_SendPipeline));

// Create the input messages to pass through the pipeline
string body =
   @"<o:Body xmlns:o='http://SampleSchemas.SimpleBody'>
      this is a body</o:Body>";
MessageCollection inputMessages = new MessageCollection();
inputMessages.Add(MessageHelper.CreateFromString(body));
inputMessages.Add(MessageHelper.CreateFromString(body));
inputMessages.Add(MessageHelper.CreateFromString(body));

// Add the necessary schemas to the pipeline, so that
// assembling works
pipeline.AddDocSpec(typeof(SimpleBody));
pipeline.AddDocSpec(typeof(SimpleEnv));

// Execute the pipeline, and check the output:
// we get a single message with all the input
// messages grouped into the envelope's body
IBaseMessage outputMessage = pipeline.Execute(inputMessages);

BizTalk 2006 Development in Visual Studio 2005

One of the requirements for BizTalk 2006 is that you must use Visual Studio 2005 for development.  This makes sense because everything in BizTalk is now based on the .NET 2.0 Framework.  Microsoft has made a few modifications for the developer on this front.


If you look at the properties of your BizTalk project, you now have a couple of new options (right-click your project, select Properties, and then navigate to Configuration Properties, Deployment).  You can now enter an Application Name, which is a new feature of BizTalk 2006.  If you do not enter a name, it will use the default application (which defaults to BizTalk Application 1).  In this dialog window, you can also specify Redeploy = True (which actually works this time), and you can specify that Host Instances be restarted upon deployment.  This can really help speed up your development time.


For redeployment, you may have a scenario with a Schemas project, a Maps project, and an Orchestrations project.  If everything is deployed and you make a modification to a schema in the Schemas project, you can right-click it and select Deploy.  The tool will now take care of undeploying the assemblies that rely on the Schemas project (in this case the Maps and Orchestrations projects), and the new Schemas project will be deployed.  This will not redeploy the Maps and Orchestrations projects!  It saves you from having to undeploy everything by hand, but you will still have to go back in and redeploy the projects that depend on the schemas.


Visual Studio 2005 has the same BizTalk Explorer as we had in BizTalk 2004.  If you use Visual Studio to create a Receive Port/Location or a Send Port, it will only look at the default application.  This can cause issues or confusion if you don’t realize what is going on.  If you specify an Application Name for your project and deploy, a new Application will get created for you.  If you then add a Send/Receive Port from within Visual Studio, it will not be using your new Application, but the default application of BizTalk Application 1.  You will need to open the BizTalk Admin Tool and move your Send/Receive Ports from BizTalk Application 1 to your new Application.  There is not much pain in doing this, but you just need to be aware of it.


To ease this process, you can Create and Deploy your Application via Visual Studio.  Then you can open the BizTalk Admin Tool and change the default Application from BizTalk Application 1 to your new project.  You can right click your Application and select Properties.  There is a checkbox that states:  Make this the default application.   Now the BizTalk Explorer in Visual Studio will be pointing to your new Application and all is well.


These are just some basic changes in BizTalk 2006, but they really help out with the development process.

NW Connected Systems May 9th Meeting

Hey all, if you’re local to the Seattle/Eastside area, please join us for the next NW Connected Systems User Group meeting. We are really excited to present the upcoming Tuesday, May 9th meeting and look forward to seeing you. The focus of the discussion is BizTalk’s Business Rules Engine. Please come ready with questions, problem statements, or general thoughts around development and implementation of the BizTalk toolset. Park on campus and there will be someone at the door to let you into the building. For more information about this event as well as past events, please visit: http://www.nwconnectedsystems.org. We will have food, beverages, and door prizes at the meeting.

Meeting Details
Date: Tuesday, May 9, 2006
Topic: BizTalk Business Rules Engine
Speaker: Jurgen Willis, Microsoft
Time: 6:00pm – 8:00pm (formal presentation @ 6:30)
Location: Microsoft Campus Bldg. 35 / Kalaloch Room #2615
http://www.microsoft.com/mscorp/info/usaoffices/pacwest/redmond.mspx

Jurgen Willis is a Program Manager on the Windows Workflow Foundation (WF) team, where he is responsible for the rules capabilities in WF. Prior to working on WF, Jurgen was a Program Manager on the Business Rules Engine that ships with BizTalk Server. Before joining Microsoft, he spent 9 years building applications and integration solutions for Fortune 500 companies.

Warmly, Brennan O’Reilly (User Group Steward)…

New BizTalk Adapters for WSE 3, SalesForce.com, and SQL Service Broker

Wow, the guys at Two Connect have been very busy.  They have released three new adapters for BizTalk Server 2006. 



They are:



  • SalesForce.com (I’ve integrated with them in the past through .NET; having a BizTalk adapter would be great)

  • Web Services Enhancements (WSE) 3.0

  • SQL Server Service Broker


They have some upcoming webcasts, along with members of the Microsoft product teams, to cover these exciting new adapters.



Make sure you check them out.  You can get more information on Jesus Rodriguez’s blog.

Writing to Eventlog while working with maps using Eventlog Functoid in Biztalk Server 2004

While testing our maps, when one fails we wonder what could have gone wrong, and we can’t trace it because we don’t know what the output of the functoids is. A single map normally uses 50-60 functoids, so where do we start debugging? It’s tough to debug a map when we are under pressure to deliver on time, but it’s part of our job to make sure that our map is stable. I thought of this functoid while working on a project in which I wanted to see the output of a specific Scripting functoid.


So I planned to have one that will write any output of data type string to the event log. Now all we need to do is use this functoid wherever we want an entry in the event log; it will return the string after writing it. So at the overhead of just one functoid, we can test our map to perfection. Just build the solution and drop EventlogFunctiod.dll in


C:\Program Files\Microsoft BizTalk Server 2004\Developer Tools\Mapper Extensions


And also GAC it. Then add it to the toolbox and use it in your maps.
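As a rough sketch, the copy and GAC steps above might look like this from a Visual Studio command prompt (the exact gacutil path varies by SDK installation, so treat this invocation as an assumption and adjust for your machine):

```cmd
REM Copy the compiled functoid assembly into the Mapper Extensions folder
copy EventlogFunctiod.dll "C:\Program Files\Microsoft BizTalk Server 2004\Developer Tools\Mapper Extensions"

REM Install the assembly into the Global Assembly Cache
gacutil /i EventlogFunctiod.dll
```

Note that the assembly must be signed with a strong name key before gacutil will accept it.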


Hope it helps you in your project.


The code is given below


 


using System;
using Microsoft.BizTalk.BaseFunctoids;
using System.Reflection;
using System.Xml;
using System.Diagnostics;

// Summary:
// This class writes an entry to the event log.
// It takes one parameter as input and outputs the same parameter
// after writing it to the event log.
namespace EventlogFunctiods
{
 public class EventlogFunctoid : BaseFunctoid
 {
  public EventlogFunctoid()
  {
   this.ID = 6464;
   SetName("EventLog");
   SetTooltip("Please pass only 1 parameter");
   SetDescription("This functoid writes the specified parameter to the event log. It takes only 1 parameter.");
   this.SetMinParams(1);
   this.SetMaxParams(1);
   SetExternalFunctionName(GetType().Assembly.FullName,
    "EventlogFunctiods.EventlogFunctoid", "EventlogFunc");
   this.Category = FunctoidCategory.String;
   this.OutputConnectionType = ConnectionType.AllExceptRecord;
   AddInputConnectionType(ConnectionType.AllExceptRecord);
  }

  public string EventlogFunc(string param1)
  {
   System.Diagnostics.EventLog.WriteEntry("Eventlog-Functoid", param1);
   return param1;
  }
 }
}


If you find it tough to implement, just download the solution, which is available at


http://www.gotdotnet.com/Community/UserSamples/Details.aspx?SampleGuid=e986cc9e-bf71-4740-9dfc-1a2d166f8cad

Any queries or bug reports would be appreciated, because “to err is human”.

Pipeline Testing Library – Part 1

As I mentioned in a previous post, I’ve been working on a helper library to test
BizTalk pipelines and custom pipeline components using NUnit or your unit testing
framework of choice. I’ve uploaded V1.0 of the library, which you can download from here.

The library is pretty straightforward to use, but I’m going to document its features
in this and the following posts so that you can get started with it easily.

Preparing to use the library

First of all, you’ll need to build the library from the downloaded source code distribution,
using Visual Studio 2005. The distribution includes the unit tests for the library,
which you can run to verify it is working correctly, if you want.

To use the library once you’ve built it, you’ll want your projects to reference the
following three libraries:

  • Winterdom.BizTalk.PipelineTesting.dll

  • Microsoft.BizTalk.Pipeline.dll

  • Microsoft.BizTalk.Pipeline.Components.dll

Creating Pipelines

To test a pipeline component, or a pipeline, pretty much the first thing you’ll need
to do is instantiate a pipeline. The library supports two different ways of doing this:
create an empty pipeline and add the necessary components using code, or create an
instance of an existing BizTalk pipeline. Both of these operations are supported for
send and receive pipelines through the methods in the PipelineFactory class.

  • CreateEmptyReceivePipeline() and CreateEmptySendPipeline() create a new receive or
    send pipeline, respectively, that has no components configured on any of its stages.
    You are then responsible for adding the necessary components to configure your pipeline
    as needed; I’ll show later how this can be done.

    Here’s an example of how these methods can be called:

    SendPipelineWrapper sendPipeline =
       PipelineFactory.CreateEmptySendPipeline();

    ReceivePipelineWrapper receivePipeline =
       PipelineFactory.CreateEmptyReceivePipeline();

  • CreateReceivePipeline() and CreateSendPipeline() create a new instance of an existing
    BizTalk pipeline. Each of these methods takes a Type handle as an argument. When you
    compile a BizTalk project that contains a pipeline, that pipeline is generated as
    a new CLR type in your BizTalk assembly (with the same name as the .BTP file it
    is derived from).

    This is very convenient, as it allows you to just add a project or file reference
    to the BizTalk project where you designed the pipeline and use the type directly.
    In fact, built-in pipelines like the XMLTransmit or XMLReceive pipeline can be used
    just this way: simply reference the Microsoft.BizTalk.DefaultPipelines.dll assembly
    and write this:

    using Microsoft.BizTalk.DefaultPipelines;

    // ...
    ReceivePipelineWrapper receivePipeline =
       PipelineFactory.CreateReceivePipeline(typeof(XMLReceive));
Configuring Components

If you’ve created an instance of an existing pipeline, you’re probably all set to
go and can skip this part (though you can still add new components to an existing
pipeline this way). If not, you’ll now want to add some components to each of the
stages of your pipeline you’re interested in.

To do this, you can use the AddComponent() method of each of the pipeline classes,
which takes an object implementing IBaseComponent (a pipeline component) and an object
of type PipelineStage representing the stage you want to add that component to. The
PipelineStage class contains a set of static readonly objects you can use to refer
to the possible stages, so you’ll never create an instance of this type directly.
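For reference, here’s a sketch of the stage objects you would expect PipelineStage to expose, mirroring the standard BizTalk pipeline stages (the exact member names are an assumption on my part; check the library source if they differ):

```csharp
// Stages valid for receive pipelines (hypothetical member names
// mirroring BizTalk's standard receive pipeline stages)
PipelineStage.Decode        // Decode stage
PipelineStage.Disassemble   // Disassemble stage
PipelineStage.Validate      // Validate stage
PipelineStage.ResolveParty  // Party resolution stage

// Stages valid for send pipelines
PipelineStage.PreAssemble   // Pre-assemble stage
PipelineStage.Assemble      // Assemble stage
PipelineStage.Encode        // Encode stage
```

Adding a component to a stage that doesn’t exist in that pipeline type (e.g. Disassemble on a send pipeline) wouldn’t make sense, so stick to the stages that match your pipeline’s direction.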

So, for example, if you want to add a new XML Disassembler component to the disassemble
stage of a receive pipeline, you can write this:

ReceivePipelineWrapper pipeline =
   PipelineFactory.CreateEmptyReceivePipeline();
IBaseComponent component = new XmlDasmComp();
pipeline.AddComponent(component, PipelineStage.Disassemble);

Here’s an example of adding an encoding component to a send pipeline:

SendPipelineWrapper pipeline =
   PipelineFactory.CreateEmptySendPipeline();
MIME_SMIME_Encoder encoder = new MIME_SMIME_Encoder();
encoder.AddSigningCertToMessage = true;
encoder.ContentTransferEncoding =
   MIME_SMIME_Encoder.MIMETransferEncodingType.Base64;
encoder.EnableEncryption = true;
pipeline.AddComponent(encoder, PipelineStage.Encode);

In a following post I’ll talk about configuring document specifications (schemas)
and executing the pipelines.

Document Explorer 2005 is installed from…?

Did I miss something or is the answer to this question harder to track down than it should be?


As documented in the ‘BizTalk Server 2006 Installation Guide – Multiserver.doc’, Document Explorer 2005 is a dependency for the BizTalk 2006 documentation.  So the operations team on my current engagement asks me where it comes from… and I respond that it’s a good question and that I’ll have an answer for them in a couple of minutes.


After a few tens of minutes, I’ve inferred that it is part of the Visual Studio 2005 installation, based on this document: http://msdn.microsoft.com/vstudio/support/AdminReadme/default.aspx   ‘Help on Help’ leads me to believe that it could (in the future?) be installed from multiple products.