October Rules Fest 2009. See you in Dallas

I’ve largely finished my presentation for the October Rules Fest 2009 conference in Dallas at the end of the month. I’m speaking on complex event processing (CEP). My plan is to provide a broad survey of CEP technologies, chiefly concentrating on the similarities and differences between event stream and rules processing. There has been a lot of interest and activity around event processing in the rules community in recent years, and not a little controversy about the best approaches and, indeed, the role, if any, of Rete rules engines in detection of complex events. Constructing the presentation has been something of a journey for me, and hopefully it will prove of interest to those attending the conference.

This is a rather last-minute plug for ORF 2009. It is the second year of this ‘alternative’ rules conference, which is differentiated by a clear and unashamed focus on technology (other well-established rules conferences focus on application at the business level), and by a wide-ranging interest in several related areas of science and research. For me, some of last year’s highlights included hearing Gary Riley explain how he managed to squeeze so much performance out of the CLIPS engine and listening to Dan Levine’s talk on rule-based mechanisms in the human brain. It’s that kind of event.

The major rule-processing vendors (ILOG (now IBM), FICO, TIBCO, etc.) are well represented at the event, together with the JBoss team. Charles Forgy, who came up with the Rete algorithm three decades ago, is a star speaker (a fascinating talk is promised on how to maximise the benefits of parallelisation in rules engines). I’m particularly looking forward to hearing Andrew Waterman’s talk on the use of rules processing in game-playing software used to promote sustainability and development of natural resources in Mexico. I’ve been aware of this project for some time. Greg Barton will be reporting on his experiences at Southwest Airlines. There are interesting sessions on rule modelling and aspects of rule languages and DSLs, plenty on CEP, and various talks on constraint programming, rule verification and other topics. And, to remind us all that technology for technology’s sake is never a good idea, John Zachman will be there to talk about the role of rules in Enterprise Architecture.

ORF 2009, only in its second year, offers an incredibly varied diet for the rule technologist. Together with the boot camps and introductory sessions at the beginning of the programme, it offers practical hands-on experience, a chance to learn about rules processing in depth, a showcase for the wide-ranging application of rules in many different areas of IT and an insight into many areas of research.

Places are still available, I understand. The cost is kept as low as possible by the conference organisers, so visit http://www.octoberrulesfest.org for more information and book in while you can.

Using the WCF SQL Adapter to submit messages to SSB queues from BizTalk

This post is a follow-up to http://blogs.msdn.com/adapters/archive/2008/06/30/using-the-wcf-sql-adapter-to-read-messages-from-ssb-queues-and-submit-them-to-biztalk.aspx and explains how to use the WCF SQL Adapter to push a message from BizTalk to a SQL Service Broker (SSB) queue.


 


Scenario

1. An XML message is dropped to a file share.
2. This XML message is made available to the WCF SQL Adapter by using the File Adapter.
3. The WCF SQL Adapter then pushes this XML message to a preconfigured SSB queue by invoking a stored procedure.

Create the database artifacts required for the SSB conversation

1. A message type, which denotes the format of the message in the queue.
2. A contract, which denotes the conversation between a sender and a receiver and also includes the type of message flowing between them.
3. The Initiator and Target queues, where messages are stored.
4. The Initiator and Target services, which use the above queues.

USE master;
GO
ALTER DATABASE <your db name here>
    SET ENABLE_BROKER;
GO
USE <your db name here>;
GO

CREATE MESSAGE TYPE
    [//SqlAdapterSSBSample/RequestMessage]
    VALIDATION = WELL_FORMED_XML;

CREATE CONTRACT [//SqlAdapterSSBSample/SampleContract]
    ([//SqlAdapterSSBSample/RequestMessage]
    SENT BY INITIATOR
    );

CREATE QUEUE InitiatorQueue1DB;

CREATE SERVICE
    [//SqlAdapterSSBSample/InitiatorService]
    ON QUEUE InitiatorQueue1DB;

CREATE QUEUE TargetQueue1DB;

CREATE SERVICE
    [//SqlAdapterSSBSample/TargetService]
    ON QUEUE TargetQueue1DB
    ([//SqlAdapterSSBSample/SampleContract]);

5. A stored procedure, say InitiatorSP, that will take the message as an argument and push it to the SSB queue. Let’s use the name RequestMsg for the argument.

SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
CREATE PROCEDURE [dbo].[InitiatorSP]
      @RequestMsg xml
AS
BEGIN
      DECLARE @DlgHandle UNIQUEIDENTIFIER;

      BEGIN DIALOG @DlgHandle
      FROM SERVICE
      [//SqlAdapterSSBSample/InitiatorService]
      TO SERVICE
      N'//SqlAdapterSSBSample/TargetService'
      ON CONTRACT
      [//SqlAdapterSSBSample/SampleContract]
      WITH ENCRYPTION = OFF;

      SEND ON CONVERSATION @DlgHandle
      MESSAGE TYPE
      [//SqlAdapterSSBSample/RequestMessage]
      (@RequestMsg);
END
GO

Create the BizTalk artifacts

1. Start the BizTalk Server 2009 Administration Console.
2. Create a new BizTalk application, say SSBSendApplication.
3. Create a new Receive Port, say FileReceivePort, and add a new Receive Location, say FileReceive.
   a. Set the Type to File and configure the Receive Folder to point to a local share, say c:\in.
4. Create a new Static One-way Send Port, say SqlSendPort.
   a. In the General tab:
      i. Set the Type to WCF-SQL.
      ii. Click Configure and set the properties as follows:
         1. In the General tab, set:
            a. Address – the format is "mssql://<servername>/<instancename>/<databasename>". For example, on my machine (using the default instance of SQL Server), it is mssql://localhost//SSBTestDb (where SSBTestDb is the name of my database).
            b. Action – the format is "TypedProcedure/<schemaname>/<storedprocedurename>". For example, in my case, it is TypedProcedure/dbo/InitiatorSP.
         2. In the Messages tab, select Template and fill in the XML box with the following:

            <InitiatorSP xmlns="http://schemas.microsoft.com/Sql/2008/05/TypedProcedures/dbo">
              <RequestMsg>
                <bts-msg-body xmlns="http://www.microsoft.com/schemas/bts2007" encoding="string"/>
              </RequestMsg>
            </InitiatorSP>

            Note that this approach requires that the XML encoding is string.
      iii. Leave the other properties as is.
   b. In the Filters tab, add a filter BTS.ReceivePortName == FileReceivePort.
5. Create a new Static One-way Send Port, say FileSendPort.
   a. In the General tab, set the Type to File and configure the Destination Folder to point to a local share, say c:\out.
   b. In the Filters tab, add a filter BTS.SPName == SqlSendPort.
6. At this point the configuration of the BizTalk application is complete, so start the application.
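For reference, the Address and Action values in step 4 follow simple URI-style templates. Here is a small Python sketch that composes them from their parts; the server, database and procedure names are just the examples from this post, not fixed values:

```python
# Sketch: composing the WCF-SQL adapter Address and Action strings from their
# component parts, per the formats described above.

def sql_address(server, instance, database):
    # Format: mssql://<servername>/<instancename>/<databasename>
    # The default SQL Server instance is written as an empty segment,
    # which is why the example address contains "//".
    return f"mssql://{server}/{instance}/{database}"

def sql_action(schema, procedure):
    # Format: TypedProcedure/<schemaname>/<storedprocedurename>
    return f"TypedProcedure/{schema}/{procedure}"

print(sql_address("localhost", "", "SSBTestDb"))  # mssql://localhost//SSBTestDb
print(sql_action("dbo", "InitiatorSP"))           # TypedProcedure/dbo/InitiatorSP
```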


 


Send the message to the SSB queue

1. Drop a request file to the c:\in share (the one that the file receive port is using). Note that this exact message will show up in the SSB queue. Here’s a sample message:

<RequestMessage>Hello World</RequestMessage>
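If you want to script this step, writing the sample request into the receive folder is all it takes. A small Python sketch (it uses a temp directory so it runs anywhere; in practice you would pass the c:\in share from the scenario):

```python
# Sketch: dropping the sample request message into the file receive folder.
import os
import tempfile

def drop_message(folder, name, body):
    # Write the message body to <folder>/<name>; the File Adapter picks it up.
    path = os.path.join(folder, name)
    with open(path, "w", encoding="utf-8") as f:
        f.write(body)
    return path

folder = tempfile.mkdtemp()  # in practice: r"c:\in"
path = drop_message(folder, "request.xml",
                    "<RequestMessage>Hello World</RequestMessage>")
with open(path, encoding="utf-8") as f:
    print(f.read())
```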


 


Consume the message from the SSB queue

1. You can now consume the message from the SSB queue. On running the query below, you will see the above message.

DECLARE @DlgHandle UNIQUEIDENTIFIER;
DECLARE @RecvMsg XML;

RECEIVE TOP (1)
    @DlgHandle = conversation_handle,
    @RecvMsg = CAST(message_body AS XML)
FROM TargetQueue1DB;

IF NOT (@DlgHandle IS NULL)
BEGIN
    END CONVERSATION @DlgHandle;
    SELECT @RecvMsg AS ReceivedMessage;
END

Search ALL tables query

Ever needed to scan the entire database for a specific value? I ran into this problem twice in one week, so I put some thought into it; hopefully it will help someone else.

DECLARE @wordToSearchFor varchar(50)
SET @wordToSearchFor = 'BizTalk Application Users' -- The word you search for

DECLARE @query varchar(1000) -- large enough to hold the generated statement
DECLARE SearchAll CURSOR FOR 
SELECT 'IF(SELECT COUNT(*) FROM [' + TABLE_SCHEMA + '].[' + TABLE_NAME
+ '] WHERE ['+COLUMN_NAME+'] = '''+@wordToSearchFor+''')>0
BEGIN SELECT * FROM [' + TABLE_SCHEMA + '].[' + TABLE_NAME
+ '] WHERE ['+COLUMN_NAME+'] = '''+@wordToSearchFor+'''
PRINT ''[' + TABLE_SCHEMA + '].[' + TABLE_NAME + ']'' END'
FROM INFORMATION_SCHEMA.COLUMNS
WHERE DATA_TYPE LIKE '%CHAR'

OPEN SearchAll
FETCH NEXT FROM SearchAll INTO @query
WHILE @@FETCH_STATUS = 0
BEGIN
    EXEC (@query)
    FETCH NEXT FROM SearchAll INTO @query
END
CLOSE SearchAll
DEALLOCATE SearchAll
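The cursor above builds one IF/SELECT/PRINT statement per character column. As a sanity check on the quoting (note how single quotes inside the search word must be doubled), here is the same template rendered in Python; the schema, table and column names are made up for illustration:

```python
# Sketch: the per-column statement the T-SQL cursor above generates,
# built in Python so the nested quoting is easier to see.

def search_statement(schema, table, column, word):
    qualified = f"[{schema}].[{table}]"
    # Single quotes inside a T-SQL string literal are escaped by doubling.
    literal = word.replace("'", "''")
    return (
        f"IF(SELECT COUNT(*) FROM {qualified} WHERE [{column}] = '{literal}')>0\n"
        f"BEGIN SELECT * FROM {qualified} WHERE [{column}] = '{literal}'\n"
        f"PRINT '{qualified}' END"
    )

print(search_statement("dbo", "Users", "GroupName", "BizTalk Application Users"))
```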

Hope this helps

MsgBoxViewer 10.15 now available

Hello,


I’ve just made version 10.15 of MBV available: http://blogs.technet.com/jpierauc/pages/msgboxviewer.aspx


This build fixes some bugs from version 10.14, of course, but also provides a lot of new queries and rules, and now uses some VBS-based queries such as "MSDTC Settings", "RPC Settings", "TCP Settings", etc.


Using VBS scripts with WMI calls in these queries allows MBV to correctly target the 64-bit registry on 64-bit servers this time.
As you probably know, a 32-bit tool like MBV using the .NET remote registry functions can only target the 32-bit view of the registry on a 64-bit server (by design in .NET), so until now MBV sometimes returned invalid MSDTC settings, for example, when targeting 64-bit servers.
Using WMI from a VBS script works around this limitation by specifying which provider (32-bit or 64-bit) to load.


I also added some new queries; one interesting addition is the "Artifacts per host" query, which lists all the artifacts used by each host.
For our support teams, for example, it is critical to be able to see very quickly which host manages which types of artifacts.


I also now identify what we call the COM+ or MSDTC "Rollup package" from the software layers found on each server. We still have some dependencies on COM+ and, obviously, MSDTC, so it is important to know which COM+ or MSDTC version is installed on each server before deciding to apply possible COM+/MSDTC hotfixes.


You will find the list of new features in this version here: http://blogs.technet.com/jpierauc/pages/msgboxviewer.aspx



Feel free to send me your comments, suggestions, and of course any bugs you find 😉


Thanks


JP

SharePoint BizTalk Integration – Kent’s running hot!

My buddy Kent Weare is launching a great series of posts on pulling and pushing documents
between SharePoint and BizTalk, using InfoPath to beautify what hard-core developers
have known for years – that thing called XML.

Kent’s just rolling up his sleeves and getting cracking – http://kentweare.blogspot.com/2009/10/biztalk-2009-sharepointwss-30.html

Well done Kent – looks great!

BizTalk: Long Running Processes – Friend or Foe?

Something I’ve come across in recent years concerns me more and more: long-running transactions.

For example, let’s take an Insurance Company implementing a Claims Process.

The way it works is:

  • Design Long Running Business Processes around BizTalk Orchestrations

    This sounds great on the surface, and since BizTalk 2004 the techniques for
    implementing it have become easier.
    Basically, the BizTalk Environment will look after ensuring state is maintained,
    waiting Orchestrations are managed, and Correlations are in place for return
    messages that may arrive seconds, minutes, weeks or months later.

    So in this case we’d implement a main claims process manager which
    runs for the duration the claim is active in the system.

    A Claim comes in, enters the System, the Claims Process Manager initiates, and
    we’re off and running.

    A common technique with long running processes is to forcibly suspend BizTalk
    messages that are in error. At a later date someone looks into the BizTalk
    Admin Console (or runs a WMI query) and ’deals with’ the suspended messages.

    The benefit of these suspended messages is that they can potentially be resumed
    right where they left off, and they are stored in the MsgBoxDB awaiting attention.

The reason why I don’t think this works:

  • Messages are immutable – meaning that while they’re in the MsgBoxDB they can’t be
    changed (technically we *can* change these messages as a hack, but it’s *not supported*).
    So if the message is incorrect, we might want to fix the problem and resubmit it –
    but we can’t do this from within the MessageBox. We have to export the message and
    provide some ’resubmit to BizTalk’ port (usually a file port).
  • The BizTalk MessageBoxDB is keeping the state of the system. In-process Claims are
    floating around as part of our system (we could equally be a bank processing Loans,
    etc.). If we lose the MessageBoxDB, this could spell even more trouble.
  • System upgrade complexity also moves up that extra notch; careful planning and various
    considerations need to be thought out. Pending Orchestrations have to be allowed to
    run through to completion; hydrated messages waiting to be sent through Ports mean
    that those ports must stay around until the messages are dealt with; and many others.
  • Backup – despite the recent advancements in SQL Server 2008 (mirroring), we can’t take
    advantage of them in the BizTalk world.
    The supported technique is to use Log Shipping. The recommended
    backup interval is 15 minutes, so worst case your system is out by 15
    minutes in the event of a crash.

    This is not entirely true: on busy systems the actual log shipping process may take
    between 15 and 30 minutes to run. This means that while the log shipping backup
    is running, the system is not being backed up. So, all in all, your system could be
    running for around an hour with no covering backup.

    This, essentially, is the state of your solution.

What does work, in my opinion:

  • Manage the State of your System in another area, such as SQL or SharePoint.
  • Where possible, keep the Orchestrations short running.
  • Upgrades are simpler.
  • System maintenance is simpler.
  • Provide an MSMQ or File Inbound Port for ’Resubmission into BizTalk’.
  • Use Content Based Routing to establish mutually exclusive processes.
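As a sketch of the first suggestion – keeping process state in your own store rather than in the MessageBox – here is a minimal illustration in Python, with sqlite3 standing in for SQL Server; the table and column names are invented for the example:

```python
# Sketch: externalizing long-running claim state into a plain database table,
# so orchestrations can stay short-running. sqlite3 is a stand-in for SQL Server.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE ClaimState (
        ClaimId     TEXT PRIMARY KEY,
        Status      TEXT NOT NULL,   -- e.g. Received, Assessed, Settled
        LastMessage TEXT             -- last payload, kept for resubmission
    )""")

def record_claim(claim_id, status, payload):
    # Each short-running orchestration records its outcome here and exits,
    # instead of staying dehydrated in the MsgBoxDB for weeks or months.
    conn.execute(
        "INSERT INTO ClaimState VALUES (?, ?, ?) "
        "ON CONFLICT(ClaimId) DO UPDATE SET "
        "Status = excluded.Status, LastMessage = excluded.LastMessage",
        (claim_id, status, payload))

record_claim("C-1001", "Received", "<Claim>...</Claim>")
record_claim("C-1001", "Assessed", "<Claim>...</Claim>")
row = conn.execute(
    "SELECT Status FROM ClaimState WHERE ClaimId = 'C-1001'").fetchone()
print(row[0])  # Assessed
```

With the state outside the MessageBox, a failed step can be fixed and resubmitted through a file or MSMQ port without hacking at immutable suspended messages.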

Food for thought folks, from what I’ve worked on and noticed out in the field.

Mick.

Connecting to SQL Azure with SQL Server Management Studio

This morning I decided this was something I wanted to do. So, I BINGed around and saw others had already been down this path. However, their techniques didn’t work for me (this is still a CTP, change is certain, and perhaps that’s why). Eventually, I connected, and thought I would do a post here for the benefit of others who may have similar issues.

As has been stated, you need to cancel out of the initial connection dialog box you get when you start Management Studio, and then choose New Query to get a connection dialog. If you don’t do that, you will not be able to connect.

Some key points about my dialog below:

  • Server name: this is the FQDN (do NOT include the “tcp:” prefix the Azure portal shows in the connection strings for server name)
  • Login: this is just my user name as shown in the connection string on the portal (and not <username>@<servername>) as most posts show
  • Click the Options button and type in the name of the database you want to connect to
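Pulling those three points together, the values that go into the connection dialog look like this; the server, database and user names below are placeholders, not a real CTP account:

```python
# Sketch: the SSMS connection dialog values described above, gathered in one place.
connect_dialog = {
    "Server name": "myserver.database.windows.net",  # FQDN, no "tcp:" prefix
    "Login": "myuser",                               # bare user name, no @<servername>
    "Database (Options tab)": "mydb",                # set via the Options button
}

for field, value in connect_dialog.items():
    print(f"{field}: {value}")
```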

 

After doing that, if all goes well, you’ll get the following dialog, which you can ignore.

 

After I did that, I was able to connect and run queries against the database.

Some more notes:

  • As Richard points out here, you need to specify a clustered index, or table creation will fail
  • As Ramaprasanna says here, most problems people have are firewall issues

 

Hope this helps. Now that I’ve successfully flexed this brand new capability, I’m going to brainstorm with myself about what cool things I can do with it.