by community-syndication | Dec 23, 2008 | BizTalk Community Blogs via Syndication
Hi all
It is time for the third posting in my series about using BizTalk to integrate with
Excel spreadsheets. My first two postings are here (installation) and here (the
schema generation wizard).
This third posting is about the runtime and how it works.
The setup
To start out, I have done a simple test, and my test project is available for download.
It has a simple spreadsheet and a schema for this spreadsheet (both are described
in my previous post), and the setup basically just has a FILE Receive Location and
a Send Port with a filter that takes everything from the Receive Port the Receive
Location belongs to. My aim is to see how fast the Spread Disassembler is.
First, a short description of my setup:
My BizTalk installation runs in a Microsoft Virtual PC 2007 virtual machine.
The host machine is a Hewlett-Packard 8710w laptop with an Intel Core 2 Duo T7700
2.4 GHz CPU, 2 GB of RAM and Windows XP Professional Service Pack 3, completely
updated as of December 7th, 2008.
The guest system is a virtual machine with one 2.4 GHz CPU, 1 GB of RAM and Microsoft
Windows Server 2003 R2 Enterprise Edition SP2, also completely updated as of
December 7th, 2008.
The test
I created 999 copies of the same spreadsheet and moved them into a folder watched
by the receive location. They were read, transformed into XML, and written to the
output folder in 3 minutes and 19 seconds, an average of about 5 spreadsheets per
second. This took me by surprise; I had expected it to be faster. So I decided to
do things more academically than just looking at the time stamps of the output files.
After all, there is plenty of functionality that could be consuming the time. So I
created a BAM Activity and View, tracking when my Receive Port starts and when it ends.
A table showing the average processing time can be seen here:

| Number of messages in test | Average processing time per message | Messages per minute |
|---------------------------:|------------------------------------:|--------------------:|
| 5                          | 0.0227 seconds                       | 2643                |
| 63                         | 0.0337 seconds                       | 1780                |
| 127                        | 0.0584 seconds                       | 1027                |
| 1966                       | 0.2714 seconds                       | 221                 |
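As a quick sanity check (mine, not part of the original test harness), the "messages per minute" column follows directly from the average processing time per message, and the initial file-drop figure follows from 999 files in 3:19:

```python
# Sanity-check the throughput figures in the table above:
# messages per minute = 60 seconds / average seconds per message.
averages = [0.0227, 0.0337, 0.0584, 0.2714]

for avg in averages:
    print(f"{avg:.4f} s/msg -> {round(60 / avg)} msgs/min")

# The initial file-drop test: 999 spreadsheets in 3 minutes 19 seconds (199 s).
print(f"{999 / 199:.2f} spreadsheets per second")
```

All four computed rates match the table, and the file-drop test works out to about 5 spreadsheets per second, as stated above.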
So it is pretty clear that performance drops drastically when the load increases.
I do not blame the Spread Disassembler for this, though. Since this is a virtual PC,
with SQL Server on the same box as BizTalk, the I/O operations from writing all the
output files to the hard drive conflict with the I/O operations of BizTalk using the
MessageBox. I find this a much more likely cause of the drop in performance than the
disassembler getting slower simply because more messages come in.
So, to sum up, it seems that the Spread Disassembler can take a pretty heavy load:
up to 2643 messages per minute (44 messages per second). This was achieved under
less than ideal operating and hardware conditions, but with the advantage that the
BizTalk Server was not doing anything else at the time.
Maybe in a later post I will take a look at more complex spreadsheets/schemas and
also test the performance of the assembler.
—
eliasen
by community-syndication | Dec 23, 2008 | BizTalk Community Blogs via Syndication
Hi all
The company hosting the eliasen.dk domain went bankrupt last Friday (December 19th,
2008), and I only just found out this morning (Monday, December 22nd).
So basically, everything since my last backup (November 27th, 2008) is gone!
Right now, I am trying to set up the blog on the new server and trying to see if I
can repost the blog posts I have written since November 27th.
—
eliasen
by community-syndication | Dec 22, 2008 | BizTalk Community Blogs via Syndication
With all the developer extensions in recent time around SharePoint (Features, Solutions
etc), I’ve found there seems to be a few little known and little used ‘other’ APIs
within the SharePoint space.
We’ve got things like WebServices and the SharePoint Object Model (SPSite etc) that
we use however, there’s a couple of other APIs that could be useful also for the times
when you’re not running locally on the SharePoint machine – they generally center
around HTTP and extending it.
Two that come immediately to mind are:
1. WebDAV – early versions of 'Web Folders' used this.
2. RPC-over-HTTP APIs – FrontPage and SharePoint Designer still use these.
(InfoPath uses this when submitting forms to promote properties to a forms
library.)
A great example of this is SharePad for SharePoint on CodePlex.
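As a rough illustration (not from the original post), a WebDAV request against a SharePoint document library is just HTTP with an XML body. The sketch below builds, but does not send, a PROPFIND request for listing a library's contents; the host and path are hypothetical placeholders:

```python
# Sketch: construct a WebDAV PROPFIND request that would list the contents
# of a SharePoint document library. Host and path are made-up examples.
host = "sharepoint.example.com"           # hypothetical server
path = "/sites/demo/Shared%20Documents/"  # hypothetical document library

# Ask only for a couple of standard DAV: properties per item.
body = (
    '<?xml version="1.0" encoding="utf-8"?>\n'
    '<D:propfind xmlns:D="DAV:">\n'
    '  <D:prop><D:displayname/><D:getlastmodified/></D:prop>\n'
    '</D:propfind>\n'
)

# Depth: 1 asks for the collection itself plus its immediate children.
request = (
    f"PROPFIND {path} HTTP/1.1\r\n"
    f"Host: {host}\r\n"
    "Depth: 1\r\n"
    "Content-Type: text/xml\r\n"
    f"Content-Length: {len(body)}\r\n"
    "\r\n"
    f"{body}"
)

print(request)
```

In practice you would send this over an authenticated HTTP connection; the point is simply that nothing SharePoint-specific is needed on the client side.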
Merry Christmas,
Mick.
by community-syndication | Dec 22, 2008 | BizTalk Community Blogs via Syndication
How to look cool at work… Stick this poster by your desk; people will walk past
and say "Hmmmm…" – guaranteed to reduce the number of questions you get each
day 🙂
(Then you could start talking about the ‘flux capacitor’ and people will believe……)
The MS folks in the Connected Systems Division (CSD) have done a superb job! A great,
comprehensive poster.
One thing to say about the poster: remember that the 'Tracking Host Instance' and
the 'InProc Host Instance' generally run within the same Host Instance by default
(it's good practice to separate these out).
Enjoy
by community-syndication | Dec 22, 2008 | BizTalk Community Blogs via Syndication
I'm excited to announce that we have now listed our new "Dublin" course on the website. This will be a three-day course where we dive into taking advantage of the hosting features found in the new application server being developed at Microsoft. If you are interested in finding out how you can use "Dublin" to host your WCF/WF services, manage and monitor them, and take advantage of advanced features for routing, versioning your services, etc., then this is the course for you. I'm looking forward to the first public offering, which should be posted on the schedule soon.

by community-syndication | Dec 22, 2008 | BizTalk Community Blogs via Syndication
The business rules engine in Windows Workflow Foundation is an often-overlooked feature that provides powerful business logic processing and rich rule-execution semantics. Some mistakenly believe this functionality can only be used from within a workflow. I worked with Microsoft recently on a couple of articles to provide developers with hands-on examples of using rules in .NET applications and in WCF services. These articles are part of a new series intended to provide small, concrete examples of how to use the technology. You'll find links in the sidebar to other articles on using sequential and state machine workflows. If you are learning WF, don't forget about our screencast series as well, where we have information on WF, WCF and more.

by community-syndication | Dec 21, 2008 | BizTalk Community Blogs via Syndication
I had a little hiccup today as I was playing around exploring Web services and Windows Azure. After scanning the Internet, I'm now convinced that most people out there working with Azure are hosting Web apps up there, not Web services, based on the number of articles I found (unless of course people doing services just aren't talking about it because it's just soooo easy :)). Perhaps it was just my search terms, or maybe that's what most people start with, but I found this interesting and thought I'd put together a post for anyone trying to get started deploying a WCF service to Windows Azure. I'm sure there are some services-centric, probably SOA-ish folks out there wondering how to do this…
I put together a WCF service and then deployed it:
- Locally (using the VS.NET development Web server)
- Locally (using the Windows Azure development fabric)
- Remotely (to Windows Azure).
In this post I'll go through the steps I followed. To get started, I:
- Created a Cloud Service project
- Added a WCF service (note that I changed it to the basicHttpBinding)
- Wrote a little code, ran the project locally using the development Web server, and it worked, no problemo
- Created a test client based on the WSDL exposed above
- Set my cloud service project as the startup project and ran it in Visual Studio.
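The binding change in the steps above boils down to a small web.config fragment. This is a minimal sketch only; the service and contract names are illustrative guesses, not from the actual project:

```xml
<!-- Minimal sketch of exposing a WCF service over basicHttpBinding.
     "ContosoService" and "IContosoService" are hypothetical names. -->
<system.serviceModel>
  <services>
    <service name="ContosoService"
             behaviorConfiguration="MetadataBehavior">
      <!-- Main endpoint: plain SOAP over HTTP, widest client compatibility. -->
      <endpoint address="" binding="basicHttpBinding"
                contract="IContosoService" />
      <!-- Metadata exchange endpoint so clients can pull the WSDL. -->
      <endpoint address="mex" binding="mexHttpBinding"
                contract="IMetadataExchange" />
    </service>
  </services>
  <behaviors>
    <serviceBehaviors>
      <behavior name="MetadataBehavior">
        <serviceMetadata httpGetEnabled="true" />
      </behavior>
    </serviceBehaviors>
  </behaviors>
</system.serviceModel>
```

basicHttpBinding is a sensible choice here because it avoids the WS-* features that the Azure load balancer and simple test clients can trip over.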
The Azure SDK adds some Visual Studio templates; you get this:
In Visual Studio, my finished solution looks like this:
SIDEBAR: so why the heck is Brian using “Contoso”? More later, but this service is part of a greater whole which I will be blogging about.
For those of you who have not worked with Azure yet, one of the cool features you get is a “development fabric” that basically gets you a local copy of Windows Azure, including storage. Having spent as much time in airplanes as I have in the past year, this is really appealing. When a cloud service project is the startup project, running the solution will launch the development fabric. The UI for it looks like this:
Next, I repointed my test client at the address shown above. However, it failed with a 405 error, "Method Not Allowed". It turns out the cause was that IIS7 did not have a handler mapping for .svc files, so it didn't know what to do with them. To fix this, I ran "ServiceModelReg -i" in the "c:\Windows\Microsoft.NET\Framework\v3.0\Windows Communication Foundation" folder. Perhaps an installation sequencing issue led to WCF not being registered in IIS, or maybe this is something you just have to do.
After I ran it, my handler mappings looked like this:
After that, I was able to call the service in my local Windows Azure development fabric. Cool!
Next up, how to deploy this to the cloud? You can do this using the CSPack command-line utility, but for quick dev work there's an easier way. In Visual Studio, right-click the cloud project and choose "Publish" from the context menu:
Now, you might expect that choosing "Publish" on a cloud service project would do something like, ummm… perhaps publish the service. Well, it doesn't. What it actually does is prepare the service for you to publish in another step: it creates the package and configuration files you'll need. It also opens a browser that navigates to the Azure portal, as well as a Windows Explorer window in the "publish" folder where it wrote out the two files. You then use the portal to upload them:
After you upload and start it looks like this in the portal UI:
And, that’s about all it takes. Clicking on the arrows in the middle swap out the “staging” and “production” instances of the deployed apps, which really is nothing more than a change at the load balancer level.
A year from now, it would probably take just a credit card number for this Web service to scale to millions of calls per hour. The future is closer than it appears, I think; these are exciting times. Go forth and learn!
Technorati Tags: Azure,Windows Azure,WCF,Cloud,Cloud Computing
by community-syndication | Dec 21, 2008 | BizTalk Community Blogs via Syndication
One of the things that surprises me about BizTalk installations is, in my experience, the limited support they receive once a project has gone live. BizTalk is a large enterprise product, and a dedicated team of BizTalk operational specialists and SQL Server DBAs should be created for the task of maintaining operational and test environments.
In […]
by community-syndication | Dec 21, 2008 | BizTalk Community Blogs via Syndication
I thought I'd blog about this issue I had, since it was in the end so easy to solve, but I had a hard time finding a good description of both my specific problem and any resolution. I am a bit ashamed to say that I got quite creative before trying…(read more)
by community-syndication | Dec 20, 2008 | BizTalk Community Blogs via Syndication
Robert Folkesson, a Swedish Microsoft developer evangelist, recently wrote about a recipe for getting good performance out of a VPC (in Swedish). In summary, and translated into English, he suggests that you run the base VPC from a USB memory stick and…(read more)