by community-syndication | Apr 27, 2009 | BizTalk Community Blogs via Syndication
Earlier today, the DMTF announced the creation of the “Open Standards Cloud Incubator” group, which will specifically focus on developing a set of informational specifications for cloud resource management. This is one of what will probably be a number of important efforts to drive additional value and choice for customers. This particular effort may catch the eye of folks on the enterprise side, as managing applications and infrastructure that span premises and cloud is a very real topic of conversation. In the end, success in the cloud for most enterprises will include the ability to utilize a broad and diverse set of computational resources, some of which may be implemented very differently from others. A sensible goal of the DMTF is to reduce the friction across different vendor offerings in these datacenter scenarios.
As we have said previously, Microsoft will approach cloud standardization from multiple angles:
· Practical interoperability – Microsoft-based services should be easily interoperable in practice with a diverse set of applications, platforms, and other clouds. We envision that our customers will integrate applications across multiple data centers, for example, having some of the component services running on Microsoft technology while other components run on Amazon, Google, Salesforce, or other data centers. Microsoft will work with other cloud vendors to produce guidance on building applications, to define the interoperability protocols, and to test real-world interoperability.
· Standards – Microsoft will continue to invest in organizations, like DMTF, that help push the interoperability state of the art forward. Management is a great example of this, as there will likely need to be further evolution of the management protocols, such as WS-Management, to add cloud-specific extensions.
· Data portability – Microsoft believes that our customers own the data that they have entrusted to our applications and platforms, and the applications they build on our platforms. Microsoft will work with other datacenter and cloud vendors to make this bi-directional, customer-centric approach a more common industry practice.
While it’s still very early to talk about elaborate technical standards for cloud computing, establishing the conduits for the conversations and outlining a collaborative approach is critical. Projects like the DMTF incubator will develop requirements and use case scenarios to allow all the participants to better understand where standards for communication with cloud services can create the most value.
As always, we pledge to be open, collaborative and transparent about our efforts in the cloud standards space. If you have thoughts on where we should focus additional attention, please let us know!
by community-syndication | Apr 27, 2009 | BizTalk Community Blogs via Syndication
A while back Matt Milner and I coauthored a whitepaper on the new WCF LOB Adapter SDK and the BizTalk Adapter Pack. It was published on microsoft.com a few weeks ago but I guess I missed it. You can download the paper from here. Here’s a brief overview:
When building applications today, it’s hard to consider building something that doesn’t involve connectivity of some sort. Applications require business data and logic that is distributed across several applications or servers. Unfortunately, not all systems provide the same interface to their data and business logic which ultimately forces developers to figure out how to talk to each of those systems. Connecting to a system doesn’t just involve opening up a port on a network address either; we have to work with different message formats, varying security mechanisms, and in many cases custom libraries that rely on proprietary mechanisms. In the end, it’s common for developers to struggle with learning a variety of different programming interfaces, communication protocols, and messaging semantics.
Windows Communication Foundation (WCF) promises to change all of that. WCF provides a unified programming model for building distributed applications using the .NET Framework. WCF was designed to provide a single unified programming model for writing either clients or services while also providing a flexible framework for different styles of communication on the wire. This approach allows developers to focus on writing code in their business domain rather than on learning new networking interfaces or object models. The code you write with WCF always looks the same but you can configure your apps to use different transport protocols like TCP, HTTP, and MSMQ; different message encodings like XML, MTOM, and binary; and varied security options including certificates, passwords, and security tokens.
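As a concrete illustration of that configuration-driven model, here is a minimal app.config sketch; the service and contract names are invented for this example. Swapping the binding changes the wire transport without touching the service code:

```xml
<!-- Illustrative WCF configuration fragment; Demo.OrderService and
     Demo.IOrderService are invented names. -->
<system.serviceModel>
  <services>
    <service name="Demo.OrderService">
      <!-- SOAP over HTTP with text/XML encoding -->
      <endpoint address="http://localhost:8000/orders"
                binding="basicHttpBinding"
                contract="Demo.IOrderService" />
      <!-- The same contract over TCP with binary encoding would only
           require swapping the endpoint, e.g.:
      <endpoint address="net.tcp://localhost:8001/orders"
                binding="netTcpBinding"
                contract="Demo.IOrderService" /> -->
    </service>
  </services>
</system.serviceModel>
```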
The goals addressed by WCF are also very similar to the integration goals of Microsoft® BizTalk® Server. Ultimately, BizTalk Server is primarily focused on providing an easy-to-manage model for connecting disparate, heterogeneous systems using a variety of different protocols, message formats, and security mechanisms without requiring much, if any, code. This whitepaper discusses how the worlds of WCF and BizTalk Server 2006 are fully converging through the WCF LOB Adapter SDK and the BizTalk Adapter Pack.
I hope those of you working with WCF, BizTalk, and BizTalk adapters find this paper helpful.

by community-syndication | Apr 27, 2009 | BizTalk Community Blogs via Syndication
I will be speaking at Microsoft DevDays 09 in The Hague on May 27-29. You can read more about the show here.

I will be delivering a pre-conference workshop on the Azure Services Platform, and I’ll be presenting several individual sessions on WCF/WF 4.0, .NET Services, “Dublin”, and the WCF REST Starter Kit. Hope to see you there!
If you find me at the conference, I’ll give you a substantial discount card for a Pluralsight On-Demand! subscription.

by community-syndication | Apr 27, 2009 | BizTalk Community Blogs via Syndication
A friend of mine, Jon Helmberger, posted a great quote on his Facebook status the other day that really pointed out how some people use technology because it is trendy, not because it is the most appropriate choice: “If RSS had a cooler name we wouldn't have shenanigans like this…” with the following link: http://tiny.cc/kxALC. In a nutshell, local Minnesota municipalities are posting information, of varying usefulness, on Twitter and Facebook.
I realize it must be hard for organizations that are not on the cutting edge of technology to make decisions about what to pick. For that matter, I have been talking to a lot of developers lately who have trouble keeping up with all the technologies, even from a single vendor like Microsoft. But it is frustrating to see people gravitate to the hot item and try to use it without really figuring out if it is the right technology for the job. I can’t imagine that with the 140-character limit in Twitter, an organization can convey much useful information. If the posts always end up linking to something else, how useful is that? It seems to me that an RSS feed would be the more appropriate mechanism for conveying this type of information. There are so many tools for reading RSS/Atom feeds and including them in a page. Sure, a Twitter feed can be read as an RSS feed, but again, the micro format seems like an inappropriate means of conveying anything other than the simplest bit of information.
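To make the point concrete: consuming an RSS 2.0 feed takes nothing beyond the standard library. This sketch, using an invented municipal feed, pulls out item titles and links:

```python
# Minimal sketch of reading an RSS 2.0 feed with only the Python
# standard library. The feed content below is invented for illustration.
import xml.etree.ElementTree as ET

RSS = """<?xml version="1.0"?>
<rss version="2.0">
  <channel>
    <title>City of Example - News</title>
    <item>
      <title>Road closure on Main St</title>
      <link>http://example.gov/news/road-closure</link>
    </item>
    <item>
      <title>Council meeting minutes posted</title>
      <link>http://example.gov/news/minutes</link>
    </item>
  </channel>
</rss>"""

def read_items(rss_text):
    """Return (title, link) pairs for every <item> in the feed."""
    root = ET.fromstring(rss_text)
    return [(item.findtext("title"), item.findtext("link"))
            for item in root.iter("item")]

for title, link in read_items(RSS):
    print(title, "->", link)
```

In practice you would fetch the feed over HTTP, but the parsing side is this simple, which is why so many readers and aggregators support it.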
We at Pluralsight have started using these technologies (Facebook and Twitter) to convey information about what is happening with classes, content, etc. Of course we are mostly on the cutting edge of technology and hopefully have some idea of the best way to use these technologies. However, even we are still learning how best to use these technologies and which information is best suited for each format.
What do you think, is Twitter an appropriate tool for this sort of thing? Do enough, or the right, people use it to make it worthwhile? Or are too many people caught up in the hype? What tools do you find work best for you to get information?

by community-syndication | Apr 27, 2009 | BizTalk Community Blogs via Syndication
Recently, Steven Martin at Microsoft talked about a cloud manifesto that would presumably define some agreement around open standards for interoperability in the “cloud”. Aside from all of the politics and secrecy, I’m wondering what this would actually look like. I’m all for open standards and I think the big players should be involved in the discussion. But what standards do we need that we don’t already have? And do more standards really make things better?
The actual manifesto can now be found at http://opencloudmanifesto.org and provides a quick look at the actual statements and a list of supporters. I can’t help but notice that not only is Microsoft not on the list, but neither are Amazon, Google, and some other major players in the cloud space. It makes me question the purpose behind this manifesto and why it was created. A lot of the points are certainly valid and easy to agree with, such as the need for security and interoperability. I think what bugs me most is the idea that a user should be able to pick up their application from one cloud and drop it in another. Obviously I do my work almost exclusively on the Microsoft platform, but my sense is that the whole write once, run anywhere thing didn’t work for Java, so I’m not too optimistic about it working in the cloud.
I’m of the mind lately that standards, while helpful, can also become overkill. Look at SOAP. Having interop for security enables the connection of code on different platforms, which is a great thing. But then we have standards for transactions, reliable messaging, etc., which are primarily useful within the enterprise to connect disparate systems. Yet many today are finding SOAP to be overkill and turning to REST. REST still uses standards like HTTP and often XML, but the architecture and the implementation are based more on using what works and keeping it simple.
I’m all for a cloud where users can have their application connect and interoperate between clouds and private data centers in a secure fashion, but like the manifesto says, I don’t think we need new specifications, we have most of what we need to achieve that interoperability and Microsoft at least is showing its commitment to those existing standards with the .NET Services and Azure platform. In the end, I think this will, and should, come down to who has the best platform, and who has the best tools for developing on that platform. I think Microsoft has some great potential in this arena, and I’m excited to watch the Azure platform mature.
Time will tell.

by community-syndication | Apr 27, 2009 | BizTalk Community Blogs via Syndication
The BizTalk Customer Advisory Team and the BizTalk UA Team are pleased to announce the Release To Web of the “Microsoft BizTalk Server 2009 Hyper-V Guide”.
The guide provides relevant information to IT professionals to enable them to make educated decisions about the advantages and tradeoffs of using Windows Server 2008 Hyper-V to virtualize BizTalk Server environments. This guidance is a result of 6 months of effort including a 6-week performance lab conducted by the BizTalk Customer Advisory Team.
by community-syndication | Apr 27, 2009 | BizTalk Community Blogs via Syndication
So, looks like today was the formal release of BizTalk Server 2009. It’s been available for download on MSDN for about a month, but this is the “general availability” date.
The BizTalk page at Microsoft.com has been updated to reflect this. Maybe I knew this and forgot, but I noticed on the Adapters page that there doesn’t […]
by community-syndication | Apr 27, 2009 | BizTalk Community Blogs via Syndication
My latest MSDN Magazine article entitled SOA Simplified: Service Virtualization with the Managed Services Engine hit the Web towards the end of last week. It focuses on simplifying large-scale SOA ecosystems by leveraging the concept of service virtualization as an abstraction for managing services over time. Microsoft Services provides a technical solution for implementing service virtualization called the Managed Services Engine (MSE). This article introduces the concepts of service virtualization and shows you how to get started with the MSE today. Hope you enjoy it.
You can download the MSE bits from CodePlex along with some installation guides, documentation, and some helpful screencast videos that illustrate exactly how to get started. Definitely worth checking out.

by community-syndication | Apr 27, 2009 | BizTalk Community Blogs via Syndication
Download it : MBVQueryBuilder.zip
Hello,
This is the first version of the MBVQueryBuilder tool, which lets you integrate your custom queries into the MBV tool.
The zip download contains:
– The tool itself, MBVQueryBuilder.EXE
– The runtime DLLs: MBVEngine.DLL and MYHC.DLL
MYHC.DLL is the runtime DLL implementing the My Health Check framework, while MBVEngine.DLL builds on MyHC.DLL and is dedicated to analyzing a BizTalk system.
– Versions of the MBV GUI and MBV console that use the runtime DLLs: MBVGUI.EXE and MBVCONSOLE.EXE
Because MBVEngine.DLL and MYHC.DLL are shared by MBVQueryBuilder.EXE and the MBV client tools (MBVGUI.EXE and MBVCONSOLE.EXE), both DLLs must be located in the same folder as the tools that use them.
You can find quick help for this tool at:
http://blogs.technet.com/jpierauc/archive/2009/04/15/mbvquerybuilder-tool.aspx
Please let me know your feedback, questions, suggestions, and any bugs you find in this new tool.
Thanks !
JP

by community-syndication | Apr 27, 2009 | BizTalk Community Blogs via Syndication
Hello,
As mentioned in a previous post (http://blogs.technet.com/jpierauc/archive/2009/04/01/comming-soon-tool-to-build-your-own-queries-for-mbv.aspx), I am now making publicly available on my blog a tool complementary to MBV that lets you create your custom MBV queries quickly and easily.
What is MBVQueryBuilder?
This tool lets you create additional queries for the MBV tool quickly and easily.
It can create and update a Query Repository XML file containing all the queries you create, and it can also generate the queries you want into an XML MBV extension file whose name starts with “MBVEXT”.
Why MBVQueryBuilder?
MBV uses a custom health check engine I developed (the “MyHC” engine), which is easily extensible in terms of queries and rules.
This extensibility is made possible via any XML file whose name starts with “MBVEXT” and which is located in the same folder as MBV.
These MBV extension files contain additional custom queries (with their rules, if any), which appear in additional query tabs in the MBV GUI.
To generate such MBV extension XML files quickly and easily, it made sense to create a dedicated tool.
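To picture the mechanism, here is a purely hypothetical sketch of what such an extension file might look like; the actual schema is whatever MBVQueryBuilder generates, and every element name below is invented for illustration only:

```xml
<!-- Hypothetical sketch only: the real schema is produced by
     MBVQueryBuilder, and these element names are invented. -->
<!-- Saved as, e.g., MBVEXT_MyQueries.xml in the same folder as MBV. -->
<Queries>
  <Query name="My custom BizTalk check">
    <!-- the query definition and any optional rules would go here -->
  </Query>
</Queries>
```

The key point is the convention: MBV picks up any file next to it whose name starts with “MBVEXT”, so no recompilation or registration is needed.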
Where can I download the latest version of this tool?
http://blogs.technet.com/jpierauc/archive/2009/04/27/mbvquerybuilder-latest-version.aspx
Where can I find quick help on this tool?
http://blogs.technet.com/jpierauc/archive/2009/04/15/mbvquerybuilder-tool.aspx
Please let me know your feedback, questions, and suggestions about this tool.
Thanks!
JP
