Higher Standards for Web Standards

Since the emergence of web services in the 90s, we’ve seen an explosion of standards and standards bodies. Sometimes they emerge from new innovations; other times they are created to break a stalemate around a similar standard or organization. Occasionally they are created simply to change the technology landscape in a way that is more favorable to certain vendors.

A question I am asked over and over is “Does Microsoft support standard X?” or “Is Microsoft going to join standards body Y?” The question we should spend more time debating is “What are the key technology or interoperability gaps, and how do we fill them?” As new initiatives emerge, we research the business and technical need before taking action. We do this by putting ourselves in the shoes of actual developers and IT pros and asking, “What are the barriers I’m facing today, and what do I need to solve them?” Pragmatism over theory, always.

In many cases, the right answer isn’t to define something new, but to carefully consider whether technologies or initiatives already exist to solve the problem. In the end, we should judge the strength of standards on industry and customer adoption alone. As an example, IBM recently announced a consortium called WSTF, the Web Services Test Forum, which leaves us a tad puzzled.

As of today, the WS-* standards are largely complete within the W3C, OASIS, WS-I, DMTF, etc., and are widely implemented in infrastructure products and used by organizations all over the world. We were thrilled to participate in the OASIS announcement just last week on WS-RX, WS-TX, and WS-SX. With regard to testing, we think it is critical that customers be able to propose scenarios that match their real-world interoperability needs. Equally important, both successes and failures must be made public. This is why we’re still evaluating our participation in WSTF.

Microsoft and other vendors have been participating in a variety of forums for quite some time to help crack the interoperability code. A few examples of forums that have yielded real-world results for developers over the years are:

  • WS-* specification development at W3C and OASIS. This formal process defined the protocol specifications for enabling service composition through addressable, secure, transacted, reliable, policy-based, end-to-end messaging.


  • WS-I, which provides the base layer of integration and interoperability profiles upon which other, more domain-specific or scenario-specific tests, profiles, and guidance are built; it is Microsoft’s primary focus for WS-* interoperability testing.


  • Interoperability plug-fests, more informal events at which multiple vendors get together to test interoperability against all other interested attendees, using agreed-upon scenarios for current and forthcoming products. The test tools that Microsoft developed remain available at http://131.107.72.15/endpoints/Default.aspx. These endpoints (and similar endpoints from other vendors) implement dozens of scenarios that customers and vendors can use to validate interoperability.


  • Greg Leake runs one of the largest interoperability labs in the world and publishes results and guidance on WS-* / WebSphere / .NET interop. Stay tuned for more here – Greg is just completing his work on WebSphere 7.

Separately, but potentially equally interesting: an interoperability project that Microsoft recently joined is the Apache Stonehenge incubator effort. We look forward to expanding our efforts in partnership with other vendors on this front. Here’s the latest.

Do we need additional standards? The answer is almost certainly yes. But before touting a new standard or standards organization, vendors need to be clear about the specific issue being solved and hold all parties accountable for solving it. Public access is a key criterion we have in mind as we think about WSTF. So what can you do? Continue to contribute at all levels; standards are only as good as the communities formed around them. As always, let us know what you think.

The new-and-improved Pluralsight website

One of the reasons I haven't been blogging as much as I'd like is that all of us at Pluralsight have been heads-down for a while working on a fairly major update to the Pluralsight website. Yesterday we finally deployed the latest and greatest bits, which you can check out by simply browsing to http://pluralsight.com. We hope you enjoy the changes. Fritz and Keith deserve most of the credit for any improvements you see. We've been primarily focused on making the site easier for our customers to navigate, especially for our growing number of Pluralsight On-Demand! subscribers and those trying to learn more about it.

If you have feedback on what you like / don't like, please send it my way. We're continually looking for ways to improve things and value the feedback from those using our site.

Thanks to all of you who have shared your passion and skills with us over the past several months. You've helped us clarify our product and our message, and move our company in the right direction. I consider myself lucky to work with some of the best people in this industry.

Presenting at Code Camp Oz 2009

For all those who have attended the Code Camp Oz events in the past, this year's event is shaping up very nicely, with lots of great topics and speakers.

I will be presenting the following two sessions:

  • Saturday – 15:15 Full Session – A Dive into Dublin: WF and WCF Application Server
  • Sunday – 12:00 Short Session – Windows 7 & Windows 2008 R2: Booting from VHD

Looking forward to catching up with everyone in Wagga Wagga.

Solution Clone v1.0

I’ve had a little utility, written a while back, that I’ve used on my consulting gigs from time to time, and I’ve finally decided it needed a home of its own. As you can read below, it lets me keep fairly complex pre-built project structures and then duplicate them with the click of a button, renaming them to whatever I need. You can download the latest release, or you can check out the project on CodePlex.

Project Description
A utility to allow you to clone an existing solution, renaming it as you do so, and updating references inside the various files.

The Problem

Do you have a favorite project structure you set up every single time you start a new project? I surely did, and I got tired of having to re-create that structure every time, especially since mine was a complex, many-levels-deep set of build files and other support files and projects I used when implementing the BizTalk Deployment Framework.

The Solution

Tired of that work, because I’m a lazy programmer, I created this project to duplicate an existing solution directory. I set up an archive project called SolutionNameHere, which contained projects like SolutionNameHere.Orchestrations, and then let this program translate every reference to "SolutionNameHere" in a path, a filename, or even some file contents to whatever new name I wanted.

File Types

At the current time, Solution Clone is aware of the following file types within your projects, for which it does more than simply rename and move: as these files are copied, their contents are updated to replace any references to the old solution name. These file types currently include:

  • .sln – Visual Studio Solution Files
  • .vbproj – Visual Basic Project Files
  • .csproj – C# Project Files
  • .build – Build files used by NAnt and other build mechanisms
  • .nant – NAnt scripts

This list will almost certainly be updated in future versions to include other types of files.
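The core clone-and-rename idea described above is straightforward to sketch. Here is a minimal, hypothetical Python illustration (the real utility is a .NET program; the function and names below are mine, not Solution Clone's): copy the template tree, rename any path segment containing the placeholder, and rewrite the contents of the recognized file types.

```python
import shutil
from pathlib import Path

# File extensions whose *contents* are rewritten, per the list above;
# every other file is simply copied with its path segments renamed.
CONTENT_TYPES = {".sln", ".vbproj", ".csproj", ".build", ".nant"}

def clone_solution(template: Path, dest_root: Path,
                   placeholder: str, new_name: str) -> Path:
    """Clone a template solution directory, replacing the placeholder
    in directory names, file names, and recognized file contents."""
    target = dest_root / template.name.replace(placeholder, new_name)
    for src in sorted(template.rglob("*")):
        rel = src.relative_to(template)
        # Rename every path segment that contains the placeholder.
        new_rel = Path(*[part.replace(placeholder, new_name)
                         for part in rel.parts])
        dst = target / new_rel
        if src.is_dir():
            dst.mkdir(parents=True, exist_ok=True)
        elif src.suffix.lower() in CONTENT_TYPES:
            # Recognized type: rewrite references inside the file too.
            dst.parent.mkdir(parents=True, exist_ok=True)
            text = src.read_text(encoding="utf-8")
            dst.write_text(text.replace(placeholder, new_name),
                           encoding="utf-8")
        else:
            # Unrecognized type: rename/move only, contents untouched.
            dst.parent.mkdir(parents=True, exist_ok=True)
            shutil.copy2(src, dst)
    return target
```

Cloning a `SolutionNameHere` template to `MyNewApp` would then produce `MyNewApp.sln`, `MyNewApp.Orchestrations`, and so on, with solution and project files internally referencing the new name.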

Supporting both active and passive scenarios in my STS

In a comment on a previous blog post Travis Spencer asked

Can you explain more about how you implemented an STS that supports both active and passive scenarios?

So here’s how –

To start with – I’ve implemented my STS class with all the logic I needed; this was done as a class library with several classes: my STS implementation, my STS configuration class, an STS service factory, my custom WindowsUserNameSecurityTokenHandler implementation, and all the classes I needed to support my custom configuration section.

Then, in order to support an active scenario, I’ve created a WCF service and, through the .svc file, configured it to use my STS service factory class:

<%@ ServiceHost Language="C#" Debug="true" Service="<my STS configuration class>" Factory="<my STS factory class>" %>

I’ve then configured the web.config of the WCF service to support my scenario; that included all the relevant binding configuration I needed, the Geneva Framework-related configuration (microsoft.identityModel), as well as any custom configuration my STS uses.
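For readers who haven't seen the Geneva (microsoft.identityModel) section before, a skeletal web.config might look something like the following. This is an illustrative sketch only: the class and assembly names are placeholders for the STS pieces described above, and the exact element names may differ between Geneva Framework builds.

```xml
<configuration>
  <configSections>
    <!-- Registers the Geneva Framework configuration section -->
    <section name="microsoft.identityModel"
             type="Microsoft.IdentityModel.Configuration.MicrosoftIdentityModelSection, Microsoft.IdentityModel" />
  </configSections>
  <microsoft.identityModel>
    <service>
      <securityTokenHandlers>
        <!-- Placeholder: plug in the custom token handler -->
        <add type="MySts.MyWindowsUserNameSecurityTokenHandler, MySts" />
      </securityTokenHandlers>
    </service>
  </microsoft.identityModel>
  <!-- plus the usual system.serviceModel binding configuration -->
</configuration>
```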

The passive scenario can seem a little bit more confusing –

Obviously I’ve started by creating an ASP.NET web application; this application basically has two web pages (admittedly I’m simplifying things a little bit for clarity): Default.aspx and Login.aspx.

Using standard ASP.NET forms authentication, the web site is configured to redirect all unauthenticated users to the Login.aspx page, which in turn has a pretty standard login implementation using my custom username validator logic and the framework’s RedirectFromLoginPage function to set the local forms-authentication cookie.
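The forms-authentication part is the standard ASP.NET web.config setup; a minimal sketch (page names match the ones above, everything else is the usual defaults):

```xml
<system.web>
  <authentication mode="Forms">
    <forms loginUrl="Login.aspx" defaultUrl="Default.aspx" />
  </authentication>
  <authorization>
    <!-- Deny anonymous users, forcing the redirect to Login.aspx -->
    <deny users="?" />
  </authorization>
</system.web>
```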

All my web-based relying parties redirect the user to the Default.aspx page; forms authentication then redirects again to Login.aspx for authentication, and, once authenticated, the user is redirected back to Default.aspx. On this page I’ve simply put the FederatedPassiveTokenService control provided with the Geneva Framework, configured to use my STS configuration class as the service; this takes care of calling the STS and posting the token back to the RP.

I hope that makes sense. Do let me know if it does not!

WebCast – BizTalk 2009 – Build & Deploy automation with Team Foundation Server 2008

Johan Hedberg’s very instructive webcast about BizTalk 2009 build and deploy automation has been posted on Channel9. I especially recommend it to everyone who could not make it to our last user group meeting.

http://channel9.msdn.com/posts/johanlindfors/MSDN-TV-Use-Team-Foundation-Server-for-BizTalk-Server-2009-development/

New video on using Team Foundation Server for BizTalk Server 2009 development with

Channel 9 Screencasts has a new video on using Team Foundation Server for BizTalk Server 2009 with Johan Hedberg from the BizTalk User Group in Sweden.
Johan goes over the basics of creating a new build type and building BizTalk 2009 projects, unit testing / continuous integration features, and the extra steps required to […]