by community-syndication | Apr 30, 2009 | BizTalk Community Blogs via Syndication
Down at BizTalk 24*7, Saravana Kumar has been working hard to organise a whole collection of BizTalk articles from us out in cyberspace.
He’s done a great job!
Check it out – http://blogdoc.biztalk247.com/
and yours truly is here – Mick Badran
by community-syndication | Apr 30, 2009 | BizTalk Community Blogs via Syndication
Earlier this week I attended the RFID workshop held at Microsoft’s Reading offices. I thought I’d write up some notes and thoughts about the event.
Content/Trainer
The course was based on material developed by the guys at Breeze (http://breeze.net/default.aspx). The material itself is excellently written and easy to follow, with some nice little bits of humour, so it avoids the common/boring feel you get on some courses. The labs – about 10 of them – let you build and enhance an RFID solution throughout the course, making it very hands-on.
The trainer for our course was Jeff Johnson from Microsoft in the UK. He delivered the course very well and added lots of additional information based on his own project experience.
The course itself was a 2 day workshop and was free.
Freebies
There were some excellent freebies on this course:
- A copy of the BizTalk 2009 RFID book
- A USB RFID reader/writer
- A small bag of RFID tags
- A BizTalk VPC (2006 R2) with everything on to do additional study away from the course
Attendees
The workshop attendees were an excellent mix of BizTalk people who had done little or no RFID work and RFID specialists who hadn’t done much with BizTalk. This resulted in lots of discussion and interesting points based on people’s differing experiences. It was also good to see some people from the UK SBUG user group there.
My Random Thoughts based on what I learnt on the course
I think the first thing to say is that I have wanted to look into the RFID side of BizTalk for quite a while. The combination of having no RFID work in the pipeline, my expectation of a steep learning curve, and not having the hardware to play around with had put me off for a long time; it just kept getting reprioritised.
Jeff had actually contacted me to promote the workshop to our user group members, and I thought it was a good time to do some training. Being free, it meant I would only lose the cost of the days off work.
To my surprise, RFID work with BizTalk is so much easier than I expected – so much so that I had to question whether it was even appropriate to call it BizTalk, it being so easy.
I don’t really want to go into too much detail about RFID itself, but some things which I feel will catch people’s interest might be:
- Traditionally, RFID work had the typical complications of working with hardware vendors: you needed different APIs for each vendor’s readers. BizTalk RFID abstracts this and gives you a standardised API which allows you to communicate with any supported vendor. You can really see the benefits of this, and you can code directly against the API from .NET applications if required.
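The value of that vendor abstraction can be sketched in a few lines. This is purely illustrative – the class and method names below are invented for the sketch and are not the actual BizTalk RFID API:

```python
# Illustrative only: a vendor-neutral reader contract in the spirit of
# BizTalk RFID's device abstraction. All names here are invented.
from abc import ABC, abstractmethod


class RfidReader(ABC):
    """Common contract that every vendor-specific driver implements."""

    @abstractmethod
    def read_tags(self) -> list[str]:
        """Return the tag IDs currently in the reader's field."""


class AcmeReader(RfidReader):
    """One hypothetical vendor's driver, hidden behind the shared interface."""

    def read_tags(self) -> list[str]:
        # Stub data standing in for real hardware I/O.
        return ["TAG-0001", "TAG-0002"]


def inventory(reader: RfidReader) -> list[str]:
    # Application code depends only on the abstraction, so swapping
    # vendors means swapping the driver, not this code.
    return reader.read_tags()
```

Swapping in a second vendor is then just another `RfidReader` subclass; `inventory` never changes.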
- BizTalk RFID is actually a separate module from the main BizTalk installation. The course discussed setting up RFID on “edge” servers configured to work with a set of readers. You can then send your events to a central hub BizTalk installation to interact with LOB applications if required.
- Following on from the above, it looks like you would use RFID with the Branch edition of BizTalk on your edge servers. This significantly reduces the cost of setting up solutions and, if I remember correctly, you don’t need a hub BizTalk Enterprise edition server unless your solution calls for one. This lets customers who use other vendors for their integration technologies still take advantage of BizTalk RFID as a low-cost complement to their current products. It also offers SME companies a cheap option and, as they grow, the choice to bring “proper” BizTalk into the picture later.
- BizTalk RFID doesn’t require the traditional BizTalk skill set. There are some configuration requirements for your devices and RFID processes, but custom components are written in .NET code in a similar way to BizTalk pipelines and pipeline components (but much simpler!).
- The RFID module gives you an execution environment for processing your RFID events, and you can have multiple processes pick up the same events depending on how you configure your bindings from devices to processes.
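The device-to-process binding idea boils down to simple fan-out: one tag-read event is delivered independently to every process bound to the device that raised it. A minimal sketch, with all names invented for illustration:

```python
# Hypothetical sketch of device-to-process bindings: each event from a
# device fans out to every process bound to that device.
from collections import defaultdict

# Device name -> list of process callbacks bound to it.
bindings = defaultdict(list)


def bind(device: str, process) -> None:
    """Bind an event-handling process to a named device."""
    bindings[device].append(process)


def publish(device: str, tag_id: str) -> list:
    # Every bound process sees the same event, independently of the others.
    return [process(tag_id) for process in bindings[device]]


# Two separate processes bound to the same reader:
bind("dock-door-1", lambda tag: f"logged {tag}")
bind("dock-door-1", lambda tag: f"alerted on {tag}")

results = publish("dock-door-1", "TAG-42")
```

Here `results` holds one entry per bound process, because both handlers received the same `TAG-42` event.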
- The configuration/administration side of things is quite BizTalk-like, but without any of the SQL Server requirements.
- Some testing tools come with RFID which allow you to test your processes without having an RFID reader.
I’m sure that there is a whole lot of other stuff that I should say but just doesn’t come to mind at the moment.
In terms of recommending the course, I would say it is a very good one, and it has really excited me about the possibilities of building BizTalk RFID solutions. I have invested two days and I’m coming away with a good understanding; with very little effort I think I can set up a pretty powerful demo to show my customers.
I believe Microsoft might run this workshop again in the autumn, or if you are outside the UK I’m sure Breeze can advise you on how to take this course.
by community-syndication | Apr 30, 2009 | BizTalk Community Blogs via Syndication
Recently, Greg Leake (Senior Director, Developer Platform at Microsoft) performed a series of IBM-written benchmarks. The results demonstrate a significant cost saving from running WebSphere on Windows on HP blades as opposed to running the same workload on IBM’s Power 570 hardware. The savings are not only significant, but the overall performance in terms of transactional throughput was markedly better than on the Power system. In addition, the tests show the ability to scale the overall system in a granular fashion. The net finding of these tests is that an organization can run its WebSphere application workloads on Windows on Intel-based hardware for markedly lower costs with improved performance as compared to Power systems. You can find an overview of the work at http://www.websphereloveswindows.com/.
For more in-depth information on the studies, check out http://www.microsoft.com/windowsserver/mainframe/whoknew/, as well as the information found on MSDN around the .NET StockTrader Sample Application. And you can find Greg Leake’s blog here.

by community-syndication | Apr 30, 2009 | BizTalk Community Blogs via Syndication
I never really “got” the need for a textual DSL when I was first introduced to MGrammar. The light really switched on when I looked into developing a DSL that would make it easier for developers to create BAM activities.
The BAML language only took a couple of hours to develop. I had experimented with simple text based DSLs before, so this was my first “real” language.
I have recorded a 20 minute webcast showing how the language works, and how it can be used. If you want to experiment with it yourself, the language is here.
module BloggersGuides
{
    language BAML
    {
        syntax Main = Activity;

        syntax Activity =
            ActivityToken
            n:NameToken
            '{'
            p:List(PKI)
            '}'
            => { activity { n, { p } } };

        syntax PKI = Milestone | Integer | Decimal | Text;

        syntax Milestone = t:MilestoneToken n:NameToken ';'
            => { t, { n } };

        syntax Integer = t:IntegerToken n:NameToken ';'
            => { t, { n } };

        syntax Decimal = t:DecimalToken n:NameToken ';'
            => { t, { n } };

        syntax Text = t:TextToken n:NameToken ';'
            => { t, { n } };

        syntax List(Element) =
            e:Element => { e } |
            list:List(Element) e:Element => { valuesof(list), e };

        token NameToken = ('A'..'Z' | 'a'..'z')+;

        @{ Classification["Keyword"] }
        token ActivityToken = "activity";

        @{ Classification["Keyword"] }
        token MilestoneToken = "milestone";

        @{ Classification["Keyword"] }
        token IntegerToken = "integer";

        @{ Classification["Keyword"] }
        token DecimalToken = "dec";

        @{ Classification["Keyword"] }
        token TextToken = "text";

        interleave Whitespace = ' ' | '\r' | '\n' | '\t';
    }
}
This is the sample input file I used on the webcast.
activity ConferenceBooking
{
    milestone BookingDate;
    text ConferenceName;
    text AttendeeCity;
    text HotelName;
    dec Price;
    integer Days;
}
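To make the shape of the parse concrete, here is a minimal Python sketch that accepts the same `activity Name { kind Name; ... }` input as the grammar above. It is for illustration only – it is not the MGrammar toolchain or the compiler mentioned below:

```python
# A tiny re-implementation of what the BAML grammar accepts, to show
# the parse result shape. Illustrative sketch, not the MGrammar tools.
import re


def parse_baml(text: str) -> dict:
    """Parse 'activity Name { kind Name; ... }' into a simple dict."""
    header = re.match(r"\s*activity\s+(\w+)\s*\{", text)
    if not header:
        raise ValueError("expected 'activity Name {'")
    name = header.group(1)
    # Everything between the opening and the final closing brace.
    body = text[header.end():text.rindex("}")]
    items = []
    for kind, item in re.findall(r"(milestone|integer|dec|text)\s+(\w+)\s*;", body):
        items.append({"kind": kind, "name": item})
    return {"activity": name, "items": items}


sample = """
activity ConferenceBooking
{
    milestone BookingDate;
    text ConferenceName;
    dec Price;
    integer Days;
}
"""
parsed = parse_baml(sample)
```

With the sample above, `parsed["activity"]` is `"ConferenceBooking"` and `parsed["items"]` lists one entry per item definition, which is essentially the structure the grammar’s `=> { activity { n, { p } } }` projection produces.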
You will need the command-line compiler to get the BAM activity created; if you contact me, I can email it to you.
Regards,
Alan
by community-syndication | Apr 30, 2009 | BizTalk Community Blogs via Syndication
For years I’ve heard people say that Microsoft technology is great for mid-range solutions. I can’t tell you how many CIOs have told me things like “we use WinTel for department-level applications, but the big-iron apps run on Unix.” There’s a general belief in the industry that the further you get into the datacenter, the less Microsoft technology you tend to find. While license numbers and data from IDC and other major third parties tell a different story, we decided it was time to put ourselves to the test against one of the biggest players in the space – AIX running on optimized IBM POWER6 with WebSphere.
About a year ago, I blogged about some .NET / Windows Server benchmark testing results produced by Greg Leake. After taking some well deserved time off, Greg went back to the lab, expanded his testing and agreed to help get to the bottom of the “back office” debate. For the first time, the results include IBM hardware (POWER6) which allows us to evaluate price / performance using typical customer configurations. In an era of cost cutting and the need to squeeze as much optimization out of systems as possible, we think customers will find this information very interesting.
Let’s start with costs. Greg’s findings demonstrate that customers save up to 81% in total system costs by running applications on Microsoft .NET and Windows Server 2008 vs. IBM WebSphere 7 on POWER6/AIX. The study also showed that customers who run their IBM WebSphere 7 applications on Windows Server 2008 and Hewlett-Packard/Intel 64-bit blade servers can save up to 66% in total system costs compared to running the same applications on IBM WebSphere 7 on an IBM POWER6/AIX platform. These results show significant savings for businesses of all sizes, and particularly speak to the value of the Windows Server platform. Our hope is that people can use these findings to get more for their money, either by making new investments or by maximizing assets they already own. Who knew WebSphere and Windows Server were such a match?
Most folks tend to buy into the potential for cost savings but speculate that they give up performance to get it. The study also found that, for the hardware configurations tested, the Microsoft .NET Framework on Windows Server 2008 handles 57% more load than WebSphere 7 running on IBM POWER6/AIX. We also found that WebSphere 7 running on Windows Server 2008 handles 37% more load than IBM WebSphere 7 running on POWER6.
So, what do these findings really mean and why am I sharing them with you today? A few important things to know:
1) Windows Server and the .NET Framework continue to be a powerful, leading combination for application development, deployment and management.
2) Customers who have made a bet on WebSphere can improve performance and reduce costs by running WebSphere on Windows.
Please visit www.websphereloveswindows.com to read more about these results and check out the .NET StockTrader downloads. We think the .NET StockTrader is a GREAT example of how small bits of innovation can have BIG impact. Don’t take our word for it – these sample applications and guidelines are available to anyone. Instructions on how to replicate the testing we conducted are also available there, and I strongly encourage customers, and all third parties, to conduct tests for themselves. Let us know what you think!
by community-syndication | Apr 29, 2009 | BizTalk Community Blogs via Syndication
There was a failure executing the receive pipeline: “Microsoft.BizTalk.DefaultPipelines.XMLReceive, Microsoft.BizTalk.DefaultPipelines, Version=3.0.1.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35” Source: “XML disassembler” Receive Port: “EDI837P_ReceivePort1” URI: “\\SERVER\folder\*.xml” Reason: Finding the document specification by message type “EDIMsg” failed. Verify the schema deployed properly.
Make sure that you don’t have the same schema deployed twice!
by community-syndication | Apr 29, 2009 | BizTalk Community Blogs via Syndication
(link to new extension file at the bottom of the post)
Jean-Pierre Auconie (JP), the creator of the Message Box Viewer tool for BizTalk, has released a supporting tool called the Message Box Viewer Query Builder. JP has also posted a quick help guide for the tool. In a previous post I went over how to […]
by community-syndication | Apr 29, 2009 | BizTalk Community Blogs via Syndication
Over on the ILOG blog, Chris Berg called out Andrew Siemer’s post at http://geekswithblogs.net/AndrewSiemer/archive/2009/03/30/ilog-rules-for-.net-3.0-ndash-quick-overview.aspx. I thought I’d post some observations by way of a response.
See http://geekswithblogs.net/cyoung/archive/2009/04/29/131593.aspx…
by community-syndication | Apr 28, 2009 | BizTalk Community Blogs via Syndication
Dimension Data has been doing some great work for us around SQL Server consolidation. They are holding an updated Techspresso session covering the topic; see below:
https://techspresso.didata.com.au/Techspresso/Melbourne/MLB_2009_05_27.htm
Lower Your Database Costs with SQL Server Database Consolidation
Discover how your organisation can do more with less at Dimension Data’s SQL Server 2008 Consolidation Seminars.
This morning’s Tech’spresso session, hosted by our Principal SQL Server Consultant, Rolf Tesmer, will reveal how server consolidation with SQL Server 2008 provides greater flexibility, centralised management, superior performance, increased scalability and reduced TCO.
Rolf will cover a range of topics, including discovery options, practical approaches and a real-life case study. The session is ideally suited to SQL technicians, CIOs, CTOs and any IT staff dealing with SQL Server environments. It’s a great chance to meet and exchange information with your peers, and best of all, breakfast and coffee are on us.
Date: Wednesday, 27 May 2009
Time: 8:00 AM – 10:00 AM
Presenters:
Rolf Tesmer – Principal SQL Server Consultant, Dimension Data
Ron Dunn – SQL Specialist, Microsoft
Venue: Dimension Data, Ground Floor, 11-17 Dorcas Street, South Melbourne VIC 3205
by community-syndication | Apr 28, 2009 | BizTalk Community Blogs via Syndication
Six years ago we announced a multi-year, multi-product vision for the enterprise which we called Dynamic IT. The principles were simple – virtualized, model-driven, service-oriented and user-centric. Shortly thereafter, we began to conceptualize a services-based offering in the cloud, which culminated in our Azure Services Platform announcement last October. These two initiatives might seem distinct, but they are in fact highly related. Simply put, this is the continued evolution from physical to logical to virtual.
In addition to the interest in cloud computing, there is even more interest in the application of cloud computing principles in the enterprise data center. As I have mentioned previously, at some point in the future, the Azure Services Platform and an enterprise data center will be, technically speaking, largely indistinguishable. Both will:
- Be highly virtualized and elastic
- Be managed in a consistent manner within and across the firewall
- Hide the complexities of hardware infrastructure from the applications they serve
We are learning a lot from the investments we are making in Azure and will use these lessons to drive additional benefits for customers, not just in the cloud but also with our on-premises technologies. One of our primary objectives is to deliver the technology that empowers enterprises to build private clouds within their existing datacenters.
While there are a lot of vendors talking about private clouds, let’s think this through a bit. Would you buy beef from a vegetarian? But I digress; here’s the point – the knowledge that we gain from running a public cloud will yield better technology for the private ones that we help customers deliver.
How does Microsoft deliver this today?
- Hardware Abstraction: delivered in Windows Server 2008 with Hyper-V
- Logical Pooling of Compute: delivered through management tools like VMM, letting you connect the compute power of your servers into a single, logical resource
- Automated Provisioning of Resources: delivered with tools like Intelligent Placement in VMM, allowing you to expand and contract workloads across your fabric
With Windows Server 2008 R2 our fabric capabilities become even stronger. In this release, we deliver enhancements to the native virtualization capabilities:
- Live Migration
- Larger VM Support: 32- and 64-bit VMs, with up to 64GB of memory per VM
- Boot from VHD & Cluster Shared Volumes (core enhancements from Windows Azure)
To net this out, we’re bringing the lessons learned from our public cloud to the places where they will likely deliver the most benefit in the near term – right in your data center. As we evolve the technology that drives Azure, you can count on continued innovation and evolution of our on-premises technology that will make private cloud computing a reality.