by community-syndication | Mar 7, 2010 | BizTalk Community Blogs via Syndication
I guess most of my readers are from a BizTalk/BPM background, so some of these posts may not be relevant to them. I'm just using the blog as a reference archive for all of the issues I'm encountering during my Silverlight learning, so please feel free to ignore it.
Exception Detail:
System.InvalidOperationException was unhandled by user code
Message="The IModuleCatalog is required and cannot be null in order to initialize the modules."
StackTrace:
at Microsoft.Practices.Composite.UnityExtensions.UnityBootstrapper.InitializeModules()
at Microsoft.Practices.Composite.UnityExtensions.UnityBootstrapper.Run(Boolean runWithDefaultConfiguration)
at Microsoft.Practices.Composite.UnityExtensions.UnityBootstrapper.Run()
at UI.App.InitializeRootVisual()
at UI.App.Application_Startup(Object sender, StartupEventArgs e)
at System.Windows.CoreInvokeHandler.InvokeEventHandler(Int32 typeIndex, Delegate handlerDelegate, Object sender, Object args)
at MS.Internal.JoltHelper.FireEvent(IntPtr unmanagedObj, IntPtr unmanagedObjArgs, Int32 argsTypeIndex, String eventName)
InnerException:
Resolution:
Set the following properties on your ModuleCatalog.xaml file:
Build Action: Content (the default is Page)
Custom Tool: (leave it empty; the default will look something like MSBuild:MarkupCompilePass1)
Even though the extension of the file is .xaml, we don't need to compile it. It's just going to be a content file available to the Silverlight app.
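For reference, a minimal ModuleCatalog.xaml for a Silverlight Prism (Composite Application Guidance) app might look something like this – the module name and XAP file name here are hypothetical:

```xml
<!-- ModuleCatalog.xaml: Build Action = Content, Custom Tool left empty -->
<Modularity:ModuleCatalog
    xmlns="http://schemas.microsoft.com/winfx/2006/xaml/presentation"
    xmlns:x="http://schemas.microsoft.com/winfx/2006/xaml"
    xmlns:Modularity="clr-namespace:Microsoft.Practices.Composite.Modularity;assembly=Microsoft.Practices.Composite">
  <!-- Each ModuleInfo points at the XAP file that contains the module -->
  <Modularity:ModuleInfo ModuleName="HelloModule"
                         Ref="HelloModule.xap" />
</Modularity:ModuleCatalog>
```

The bootstrapper loads this file at runtime (via ModuleCatalog.CreateFromXaml), which is why it must ship as plain content rather than being compiled.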
Nandri!
Saravana
by community-syndication | Mar 7, 2010 | BizTalk Community Blogs via Syndication
Hi all
I have just installed VMware Workstation 7.0.1 in order to start building 64-bit guest
OSes so I can try out Windows Server 2008 R2 and SharePoint 2010. Microsoft Virtual
PC does not support 64-bit guest operating systems, and since I really appreciate being
able to run guest operating systems in a window on my host PC, I saw no other way out
than getting and installing VMware Workstation.
Now, after installing it, I tried to create my first Windows 2008 R2 virtual machine,
but that failed because I hadn't enabled Virtualization Technology (VT) in my BIOS.
So I rebooted, entered the BIOS and enabled it. That worked fine, and I now have a virtual
machine running Windows 2008 R2 64-bit.
BUT, when I then wanted to fire up one of my old Microsoft Virtual PC virtual machines
that I had earlier saved, I got an error saying that the saved-state file was corrupt. I
had the choice of deleting the saved file or doing nothing. Since I needed the VPC,
I chose to delete the saved-state information and hoped that I could recreate what was
then lost.
Then, when starting up my next saved virtual machine from Microsoft Virtual PC, I
got the same error. I have now played around with it, and it simply seems that if
I save the state of a VPC and then turn on VT, the file gets corrupted and cannot
be used. I even tried saving state while VT was enabled, and then I disabled it
and re-enabled it. The saved file was again corrupt.
This REALLY sucks! It means that you need to be really careful about when you save
the state and when you do not.
—
eliasen
by community-syndication | Mar 7, 2010 | BizTalk Community Blogs via Syndication
Note: This blog post is written using the .NET framework 4.0 RC 1
Most of the time I use compiled workflows in Windows Workflow Foundation 4. It's nice and easy: you design the workflow, compile it, and at runtime there is a .NET type you use to create and run workflows. The main drawback is that this approach isn't very flexible; sometimes you want to be able to change your workflow definition at runtime, or store it in a database, so recompiling isn't an option.
Fortunately we can also load a workflow from the XAML file itself and execute the resulting workflow activity. This is done using the ActivityXamlServices class, which lets us load the XAML file and returns an activity; to be exact, it returns a DynamicActivity as a wrapper around your definition.
The simplest option is just to call Load() passing in the file name like this:
Activity workflow = ActivityXamlServices.Load("YourWorkflow.xaml");
If you are using activities, or other types, from the local assembly, however, this is going to fail, and you need a slightly more verbose way of doing things, like this:
var settings = new XamlXmlReaderSettings()
{
    LocalAssembly = typeof(SendForManualApproval).Assembly
};
var reader = new XamlXmlReader("YourWorkflow.xaml", settings);
Activity workflow = ActivityXamlServices.Load(reader);
We need to use the XamlXmlReaderSettings to indicate what the local assembly reference in the XAML is.
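Once the activity is loaded, it can be executed like any compiled workflow. A minimal sketch follows; the input argument name "Name" is hypothetical, so match it to the InArgument defined in your own XAML:

```csharp
using System;
using System.Activities;
using System.Activities.XamlIntegration;
using System.Collections.Generic;

class Program
{
    static void Main()
    {
        // Load the workflow definition from XAML (returns a DynamicActivity).
        Activity workflow = ActivityXamlServices.Load("YourWorkflow.xaml");

        // Pass input arguments by name and run the workflow synchronously.
        var inputs = new Dictionary<string, object> { { "Name", "World" } };
        IDictionary<string, object> outputs = WorkflowInvoker.Invoke(workflow, inputs);
    }
}
```

WorkflowInvoker is fine for short, synchronous runs; for long-running workflows you would use WorkflowApplication instead.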
Enjoy!
www.TheProblemSolver.nl
Wiki.WindowsWorkflowFoundation.eu
by community-syndication | Mar 6, 2010 | BizTalk Community Blogs via Syndication
Tim Bass posted on "Orwellian Event Processing". I was involved in a heated exchange in the comments, and he has more recently published a post entitled "Disadvantages of Rule-Based Systems (Part 1)". Whatever the rights and wrongs of our exchange, it clearly failed to generate any agreement or understanding of our different positions. I don't particularly want to promote further argument of that kind, but I do want to take the opportunity of offering a different perspective on rule-processing and an explanation of my comments.
For me, the "red rag" lay in Tim's claim that "…rules alone are highly inefficient for most classes of (not simple) problems" and a later paragraph that appears to equate simplicity of form ("IF-THEN-ELSE") with simplicity of function. It is not the first time Tim has expressed these views, and not the first time I have responded to his assertions. Indeed, Tim has a long history of commenting on the subject of complex event processing (CEP) and, less often, rule processing in "robust" terms, often asserting that very many other people's opinions on this subject are mistaken. In turn, I am of the opinion that, certainly in terms of rule processing, which is an area in which I have a specific interest and knowledge, he is often mistaken.
There is no simple answer to the fundamental question "what is a rule?" We use the word in a very fluid fashion in English. Likewise, the term "rule processing", as used widely in IT, is equally difficult to define simplistically. The best way to envisage the term is as a "centre of gravity" within a wider domain. That domain contains many other "centres of gravity", including CEP, statistical analytics, neural networks, natural language processing and so much more. Whole communities tend to gravitate towards, and build themselves around, some of these centres.
The term ‘rule processing’ is associated with many different technology types, various software products, different architectural patterns, the functional capability of many applications and services, etc.There is considerable variation amongst these different technologies, techniques and products.Very broadly, a common theme is their ability to manage certain types of processing and problem solving through declarative, or semi-declarative, statements of propositional logic bound to action-based consequences.It is generally important to be able to decouple these statements from other parts of an overall system or architecture so that they can be managed and deployed independently.
As a centre of gravity, "rule processing" is no island. It exists in the context of a domain of discourse that is, itself, highly interconnected and continuous. Rule processing does not, for example, exist in splendid isolation from natural language processing. On the contrary, an ongoing theme of rule processing is to find better ways to express rules in natural language and map these to executable forms. Rule processing does not exist in splendid isolation from CEP. On the contrary, an event processing agent can reasonably be considered a rule engine (a theme in "Power of Events" by David Luckham). Rule processing does not live in splendid isolation from statistical approaches such as Bayesian analytics. On the contrary, rule processing and statistical analytics are highly synergistic. Rule processing does not even live in splendid isolation from neural networks. For example, significant research has centred on finding ways to translate trained nets into explicit rule sets in order to support forms of validation and facilitate insight into the knowledge stored in those nets.
What about simplicity of form? Many rule processing technologies do indeed use a very simple form ("If…Then", "When…Do", etc.). However, it is a fundamental mistake to equate simplicity of form with simplicity of function. It is absolutely mistaken to suggest that simplicity of form is a barrier to the efficient handling of complexity. There are countless real-world examples which serve to disprove that notion. Indeed, simplicity of form is often the key to handling complexity.
Does rule processing offer a "one size fits all"? No, of course not. No serious commentator suggests it does. Does the design and management of large knowledge bases, expressed as rules, become difficult? Yes, it can, but that is true of any large knowledge base, regardless of the form in which knowledge is expressed.
The measure of complexity is not a function of rule set size or rule form. It tends to be correlated more strongly with the size of the "problem space" ("search space"), which is something quite different. Analysis of the problem space, and of the algorithms we use to search through that space, are of course the very things we use to derive objective measures of the complexity of a given problem. This is basic computer science and common practice.
Sailing a dreadnought through the sea of information technology and lobbing shells at some of the islands we encounter along the way does no one any good. Building bridges and causeways between islands, so that the inhabitants can collaborate in open discourse, offers hope of real progress.
by community-syndication | Mar 5, 2010 | BizTalk Community Blogs via Syndication
San Diegans, time is running out: the Windows Azure conference (I blogged about it here) is *tomorrow*. This is a great opportunity to ramp up quickly on what Windows Azure is and how it can be used in the real world. Come see why everyone is so excited, and why everyone agrees that this is a major shift in our industry. This is not future-tech, and the cloud isn't vapor anymore – this is live and production-ready today.
I will be presenting on Windows Azure platform AppFabric, and specifically how to leverage it to bridge between on-premise and off-premise (or, from-one-premise-to-another-premise).
Hope to see you there!
by community-syndication | Mar 5, 2010 | BizTalk Community Blogs via Syndication
Just a quick link to a post on Maxime's blog:
http://maxime-labelle.spaces.live.com/Blog/cns!D8D9369449D177DA!236.entry
Maxime added support for deploying vocabularies and policies to the PowerShell provider for BizTalk. In our opinion this is the easiest way to deploy BRE artefacts.
For now it is only available when you grab and build the latest sources. It will be included in the final […]
by community-syndication | Mar 4, 2010 | BizTalk Community Blogs via Syndication
BizTalk Server is an enterprise product; there are no second thoughts about it. Any enterprise product will go through a phase of being left running on a very old version in a production environment. Once the code is up and running in production with live business, it becomes mission critical. Enterprises just don't upgrade their applications drastically, or the platform on which they are running, until there is a compelling business case behind it.
The organisation I’m working on is also in a similar situation, and I been tasked to put the future road map for BizTalk Server in the organisation. I just need to come up with proper reasons, why we should move on to latest version of BizTalk Server (keeping in mind the cost associated with it). We are currently on BizTalk Server 2006 (not R2), and our plan is to move to BizTalk 2009 R2 (skipping 2 versions in between 2006 R2, and 2009).
This list is not going to be exhaustive; the scenarios will vary from organisation to organisation based on usage (for example B2B integration, SAP integration, health care with flat files, etc.). In our case we use BizTalk for a couple of different scenarios:
- A BPM process, more like a human-workflow kind of solution, interfacing with one of our internal in-house BPM products.
- A composite business services layer on top of our middle-tier integration services.
So both solutions use a lot of standard orchestrations, maps, schemas, WSDL, the SOAP adapter, the MQSeries adapter, a custom adapter to talk to the in-house BPM software, BizTalk web publishing capabilities, and BAM.
Note: Some of the features were already available in BizTalk Server 2006 R2 and BizTalk Server 2009 (example: WCF support). This list just shows the cumulative gain of moving from BizTalk 2006 to BizTalk 2009 R2.
Reason #1: Nearing end-of-life (EOL) for mainstream support
This is our number one reason for thinking about moving to BizTalk Server 2009 R2.
| Product | Mainstream Support Retired | Extended Support Retired |
| --- | --- | --- |
| BizTalk Server 2006 | 12/07/2011 | 12/07/2016 |
It’s not just the BizTalk server end of life threatens us; it’s also the end of life for Windows Server 2003 R2.
| Product | Mainstream Support Retired | Extended Support Retired | Service Pack Retired |
| --- | --- | --- | --- |
| Windows Server 2003 R2 Enterprise x64 Edition | 13/07/2010 | 14/07/2015 | 14/04/2009 |
Running your solution on an out-of-support platform is going to be expensive and results in indirect costs. Apart from the cost, there may be scenarios where the support cycle takes longer to resolve issues.
Reason #2: Platform upgrade and performance gain
The platform upgrade for moving from BizTalk Server 2006 to 2009 R2 looks like this:
| From | To (at least) |
| --- | --- |
| Windows Server 2003 R2 | Windows Server 2008 R2 |
| SQL Server 2005 | SQL Server 2008 R2 |
| Visual Studio 2005 | Visual Studio 2010 |
BizTalk Server (environment) performance is always closely tied to the performance of the platform on which BizTalk Server is installed and running. BizTalk Server depends heavily on the performance of SQL Server and of Windows Server itself, so moving to the latest platform should provide a considerable performance gain (depending on the scenario) on the same application code base.
Reason #3: Virtualization support
Virtualization support for BizTalk Server 2006 is on a best-effort basis, meaning support will be provided only if you can reproduce the problem in a physical environment. With 2009, however, Hyper-V-based virtualization is fully supported.
Licensing around Virtualization
Text Snippet from http://www.microsoft.com/biztalk/en/us/pricing-licensing-faq.aspx
"Similar to SQL Server Enterprise, BizTalk Server 2009 ENT can be licensed for unlimited virtualized processors that are available on a single physical server. The customer will be required to license the number of physical processors on a server."
This is one of the key factors for us; we are seeing more and more projects being done with BizTalk Server within the organisation. In some cases, due to the volume and size of the project, it's not worth having a dedicated BizTalk Server environment; at the same time, due to the criticality of the applications, it's not possible to run them in a shared environment.
With the support for virtualization and the liberal licensing model, it will help us isolate the applications from one another while keeping costs low.
Reason #4: Support for Windows Communication Foundation
This is one of the key factors for us in thinking about migration. Web service support in BizTalk Server 2006 is provided primarily by the SOAP adapter and the Web Services Publishing Wizard. Moving to 2009 R2 will open up the opportunity to use Microsoft's unified distributed-communication platform, Windows Communication Foundation (WCF), seamlessly from BizTalk Server. WCF provides great enhancements, like support for many WS-* protocols. There is also considerable gain in the WCF adapter configuration.
Examples:
- Choice of encoding
- Transaction support using WS-AtomicTransaction
- Imposing restrictions on received message size
- Better control over the incoming and outgoing message (whole envelope, body, or XPath)
Overall, you'll get better web services support compared to the SOAP adapter story.
Reason #5: Developer productivity enhancements
Just moving to the latest Visual Studio platform opens up a lot of productivity enhancements for developers.
BizTalk Server 2009 R2 enhances mapper productivity to a great extent. This is the first major update to the mapper tool UI since BizTalk Server 2004. There are features like:
- Moving links between pages
- Searching for nodes
- Hiding out of context nodes
- Auto scrolling to the active nodes
- Relevance tree – show only mapped nodes, to reduce clutter etc
BizTalk Administration Console improvements like:
- Ability to configure the polling interval at host level
- Ability to export/import settings from one environment to another (e.g. PERF to PROD)
- Ability to set up host instance registry settings from the console
There is also support for testing maps.
This is not an exhaustive list, but you get the idea.
Reason #6: Moving to IIS 7.0 from IIS 6.0
This is one of the less-highlighted features. As part of the platform alignment, it will help us move from Internet Information Services (IIS) 6.0 to 7.0. The difference between 6.0 and 7.0 is revolutionary: IIS 7.0 works on the principle of a bare-bones configuration, where you add only the modules your application requires. Again, this is going to indirectly support your BizTalk environment configuration if your solution is heavily dependent on web-services-based SOA integration.
Reason #7: Support for new Visual Studio project template and support for MSBuild
BizTalk Server 2006 uses its own proprietary Visual Studio project template with its own extension points. This makes it harder to do things like MSBuild builds and continuous integration. Moving to 2009 R2 will give a consistent experience with other Visual Studio project types, like class libraries.
Reason #8: Enterprise Service Bus (ESB) Toolkit 2.0
This will be one of the great candidates for moving to 2009 R2. Microsoft fully supports the toolkit, which has some nice features like an exception handling framework with a portal, a transformation service, dynamic routing, construction of composite services without using orchestration, etc.
More information can be found at http://msdn.microsoft.com/en-us/biztalk/dd876606.aspx
Even though we are not using it at the moment, moving to 2009 R2 will help us think about and architect new solutions that take advantage of the ESB Toolkit, reducing the plumbing work.
Reason #9: Support for Host Integration Server 2009
One of the reasons for the move to 2009 is the support for Host Integration Server 2009, which includes a WCF channel for WebSphere MQ. Being an enterprise coming from an IBM background, it's no surprise we use MQ heavily in the organisation.
Reason #10: BizTalk Adapter pack and custom WCF LOB Adapter framework
This is one of the nice to have features for us.
Reason #11: Extended BAM Interceptor support for WCF and WF
BAM interceptors extend the functionality enjoyed by BizTalk Server to Windows Workflow Foundation (WF), Windows Communication Foundation (WCF), and other runtimes. By using the BAM interceptors, you can track your business processes without recompiling your WF or WCF solution – integration is done through a configuration file.
This provides the opportunity to bring your external WCF services into the same BAM monitoring framework you build for your applications.
Also, Microsoft has relaxed the licensing requirement to use BAM outside the BizTalk Server environment, as long as you have a Standard or Enterprise license in your organisation (I can't find any links to support this statement, but I remember that was the case).
Reason #12: A better system, with cumulative bug fixes from the past 3 years
The obvious one: you get the cumulative bug fixes from the past 3 years.
A big downside if you don't have Microsoft Software Assurance
To upgrade from BizTalk Server 2006 to BizTalk Server 2009, you need to acquire Microsoft Software Assurance for BizTalk Server 2006. Acquiring Software Assurance for BizTalk Server 2006 will ensure you receive BizTalk Server 2009 at no additional cost. Otherwise, customers will pay full price for BizTalk Server 2009 if they want to upgrade.
There are various other areas of improvement, like RFID, EDI, B2B, SharePoint integration, etc., worth investigating based on your requirements. I also didn't highlight a lot of features we are not using, for example the SCOM pack upgrade, improved trading partner management, etc.
There are some deprecated features as well from 2006 to 2009 R2, but none of them are relevant to us:
- Human workflow support
- Business Activity Services support
- Migration of HAT functionality to the BizTalk Admin Console, etc.
Nandri!
Saravana
by community-syndication | Mar 4, 2010 | BizTalk Community Blogs via Syndication
In a previous post I described a way to deal with untyped messages in the Business Rule Engine. This allows for flexibility in scenarios where you want to use a single set of rules (let's call it an "untyped policy") on multiple types of messages.
Untyped policies work great when tested directly in the Business Rules […]
by community-syndication | Mar 4, 2010 | BizTalk Community Blogs via Syndication
This is a follow-up to my previous post on this topic. The method described in that post doesn't seem to work when the policy is called from an orchestration. For more background information, see this blog post.
In this post I will use the exact same sample as in the previous post. These are the […]
by community-syndication | Mar 4, 2010 | BizTalk Community Blogs via Syndication
I've really enjoyed my time here in Dubai for the first-ever TechEd Middle East. Pooya Darugar, my track owner from the local Microsoft office, told me that this was actually the first "paid" technical conference to be held in the area, so it was a pretty major undertaking.
They had an excellent turnout, somewhere between 1,600 and 1,800 attendees, and even had folks showing up on the last day with money to get in. Suffice it to say, the show was a great success.
Amory Somers Vine and the rest of the conference organizers did a fantastic job. And a special thanks to Pooya for doing another first-class job in running my track and taking care of us while we were here – I know everyone appreciated it. Kudos to the whole TechEd ME team.
I delivered five presentations during the show on various connected-systems topics, and I participated in a panel discussion with Scott Hanselman, Hammad Rajjoub, and Ronald Sunarno on the last day:
- SOA201 – Building RESTful Services with WCF
- ARC201 – Windows Azure Platform Overview
- SOA301 – Service Virtualization
- SOA203 – Introducing AppFabric in the Cloud
- SOA302 – What's New in WCF/WF 4.0
- IAT304 – Interactive Session: Agile Architect
Demos: if you attended any of my sessions, you can download my demos here.
Pluralsight sponsored the show by providing all TechEd ME attendees with a free one-week pass to the Pluralsight On-Demand! .NET training library. As an attendee, you should have received an activation card with the rest of your materials – don't forget to activate it right away. If you can't find the card, just contact us, and we'll send you one.
Thanks to everyone for a great week – gotta run – just have one extra day in Dubai to do some sight seeing, and eat a little more of the wonderful local food!