256 Worker Role 3D Rendering Demo is now a Lab on my Azure Course

Ever since I came up with the crazy idea of creating an Azure application that would spin up 256 worker roles to render a 3D animation created using the Kinect depth camera, I have been trying to think of something useful to do with it.

I have also been busy developing training materials for a Windows Azure course that I will be delivering through a training partner in Stockholm, and for customers wanting to learn Windows Azure. I hit on the idea of combining the render demo with a course lab, in which the students would create and deploy their own mini render farms that all participate in a single render job consisting of 2,000 frames.

The architecture of the solution is shown below.

As students would be creating and deploying their own applications, I thought it would be fun to introduce some competitiveness into the lab. In the 256 worker role demo I capture the rendering statistics for each role, so it was fairly simple to include the student's name in these statistics. This allowed the process monitor application to track the number of frames each student had rendered and display a high-score table.

When I demoed the application, I deployed one instance that started rendering a frame every few minutes, and the challenge for the students was to deploy and scale their applications and overtake my single role instance by the end of the lab time. I had the process monitor running on the projector during the lab so the class could see the progress of their deployments and how they were performing against my implementation and their classmates.

When I tested the lab for the first time in Oslo last week it was a great success: the students were keen to be the first to build and deploy their solution and then watch the frames appear. As the students mostly had MSDN subscriptions, they were able to scale to the full 20 worker role instances, and before long we had over 100 worker roles working on the animation.

There were, however, a couple of issues caused by the competitive nature of the lab. The first student to scale the application to 20 instances would render the most frames and win; there was no way for others to catch up. Also, as they were competing against each other, there was no incentive to help others on the course get their application up and running.

I have now re-written the lab to divide the students into teams that will compete to render the most frames. This means that even if one developer on a team can deploy and scale quickly, the other teams still have a chance to catch up. It also means that if a student finishes quickly and puts their team in the lead, they will have an incentive to help the other developers on their team get up and running.

As I was using “Sharks with Lasers” for a lot of my demos, and had reserved the sharkswithfreakinlasers namespaces for some of the Azure services (well, somebody had to do it), the students came up with some creative alternatives, like “Camels with Cannons” and “Honey Badgers with Homing Missiles”. That gave me the idea of having the teams choose a creative name involving animals and weapons.

The team rendering architecture diagram is shown below.

Render Challenge Rules

In order to ensure fair play a number of rules are imposed on the lab.

· The class will be divided into teams, and each team chooses a name.

· The team name must consist of a ferocious animal combined with a hazardous weapon.

· Teams can allocate as many worker roles as they can muster to the render job.

· Frame processing statistics and rendered frames will be vigilantly monitored; any cheating, tampering, or other foul play will result in penalties.

The screenshot below shows an example of the team render farm in action: Badgers with Bombs have taken the lead over Camels with Cannons, and both are leaving the Sharks with Lasers standing.

If you are interested in attending a scheduled delivery of my Windows Azure or Windows Azure Service Bus courses, or would like on-site training, more details are here.

BizTalk Community Series: Introducing Ritu Raj

The BizTalk community series blog posts have been running for a number of months now. Since January I have introduced 20 BizTalk community members, and there are still more stories to follow over the next couple of months. There are a lot of IT professionals around the world dedicated to BizTalk, and many of them contribute to the community through blogs, forums, wikis, articles, books, and presentations.

The people I interviewed and talked to for this series are representatives of the BizTalk community. They are committed to BizTalk and to its community, sharing their knowledge and experience through the channels just mentioned. Today I have another story for you on an enthusiastic BizTalk professional from India: Ritu Raj.

Ritu is a 27-year-old engineer who has been working in the IT industry for the last 5 years. He is currently based out of Pune, Maharashtra, India. After graduating, Ritu started working with BizTalk at KEANE as a Software Engineer. He then moved on to work with CompuGain and Mahindra Satyam as a Senior Engineer working on BizTalk 2009 and 2010. Having worked with Microsoft technologies for a relatively short period, Ritu moved on to join “Syncada from VISA” as a Senior Developer/Architect, using his experience and expertise in BizTalk to design and develop solutions.

Ritu is a dedicated follower of the TechNet wiki and tries to help out on the BizTalk forums. In the past he has written a few blog posts. He is a core developer throughout, and that is what he loves doing.

“Developing is what I am best at. I have been actively working on architecting solutions on BizTalk and related technology.”

Ritu’s view on BizTalk is as follows:

“For me BizTalk is the best tool available in market for bizness (read Business) talking. BizTalk with the evolution of technology and science is remarkable. I think myself to be a privileged to see this transition and working with the same. BizTalk has adapted itself within the moving technology space. It has been growing more robust and reliable with every new release.”

Ritu loves spending time with his loved ones and friends. He likes reading articles on new trends in technology. Besides that he loves soft music, action and historic movies.

Like many fellow Indian (BizTalk) professionals, Ritu is a cricket fan and loves playing in his free time. He is a hard-core Indian Cricket team fan and likes to see the England team win, if India is not the opponent. Besides cricket, he follows lawn tennis, and Roger Federer is his all-time favorite sportsperson.

A final comment from Ritu, expressing his appreciation for this series:

“Also, a big thanks to Steef-Jan for featuring me on introducing series and starting such a great initiative to bring BizTalk community closer and closer by each day. I would like to thank you for all the great posts and work you have done towards helping out the community.”

Thanks, Ritu, for your time and contributions so far; keep it up.

Installing Windows 8 using USB

When you buy a netbook, it's most likely there won't be a DVD drive, so it's becoming more important to be able to install an OS from a USB device. In the past it was a real pain to get a USB drive bootable; you needed to fiddle around with the Diskpart utility and follow the […]

The post Installing Windows 8 using USB appeared first on BizTalk360 Blog.

Blog Post by: Saravana Kumar

BizTalk 2010 R2 CTP: Azure Service Bus Integration-Part 5 Sending messages to Service Bus Queues using Sessions


What are Service Bus Sessions?

Service Bus Sessions are actually a rather broad subject, as there are a few different scenarios in which they can be used.  At its simplest, I consider Service Bus Sessions to be a way to relate messages together.  More specifically, here are a few ways in which they can be used:

  • To address the maximum message size constraint.  Service Bus Queues support messages of 256 KB or smaller.  Using Sessions allows us to break a larger message down into smaller messages and then send them over the wire.  A consumer, or receiver, can then receive all of these message “chunks” and aggregate them back together.
  • To support receiving a related set of messages in First In First Out (FIFO) fashion
  • Allows for affinity between a consumer and a Service Bus Queue in competing consumer scenarios.  Imagine having 3 consuming clients all trying to receive messages from the same Service Bus Queue.  Under normal circumstances you cannot be assured that one receiver will receive all messages within a message set.  One can expect the messages to be distributed amongst the clients as each consumer “competes” for the right to process a particular message.  In this scenario, once a receiver has started to process a message within a session, that consumer will process all messages within that session barring some sort of application crash.
  • In some scenarios, using a Service Bus Session allows for routing of messages.  Within a receiver, you can specify an explicit Session that you are interested in.  So in some ways a Session can be used almost like a filter.  I am not suggesting that this approach be used instead of Service Bus Topics/Subscriptions, but there may be a specific business requirement to do this.
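
A minimal sketch of the chunking idea from the first bullet, assuming the same 2010-era Microsoft.ServiceBus client library used in the receiver code later in this post. The namespace, issuer credentials, queue name, and the "Order-1234" session id are placeholders, not values from the original demo.

```csharp
using System;
using System.IO;
using System.Text;
using Microsoft.ServiceBus;
using Microsoft.ServiceBus.Messaging;

class ChunkSender
{
    static void Main()
    {
        // Placeholder credentials -- replace with your own namespace and issuer details
        TokenProvider credentials = TokenProvider.CreateSharedSecretTokenProvider("<your_issuerName>", "<your_IssuerKey>");
        Uri serviceUri = ServiceBusEnvironment.CreateServiceUri("sb", "<your_namespace>", string.Empty);
        MessagingFactory factory = MessagingFactory.Create(serviceUri, credentials);
        QueueClient client = factory.CreateQueueClient("<your_sessionful_queue>");

        // A payload larger than the 256 KB queue limit, split into smaller chunks
        string largePayload = new string('x', 600 * 1024);
        const int chunkSize = 192 * 1024; // leave headroom for message headers

        for (int offset = 0; offset < largePayload.Length; offset += chunkSize)
        {
            string chunk = largePayload.Substring(offset, Math.Min(chunkSize, largePayload.Length - offset));
            BrokeredMessage message = new BrokeredMessage(new MemoryStream(Encoding.UTF8.GetBytes(chunk)), true)
            {
                // Every chunk carries the same SessionId, so a single session
                // receiver gets all of them, in order, and can reassemble the payload
                SessionId = "Order-1234"
            };
            client.Send(message);
        }
    }
}
```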

Why are Service Bus Sessions important in BizTalk processing?

BizTalk deals with a variety of different messaging scenarios in many different industry verticals.  Supporting Service Bus Sessions is just another tool in the BizTalk toolbox for supporting new requirements.  A scenario that I came up with is dispatching messages.  For instance, if we wanted to load up a field worker with all of his orders, wouldn't it be nice to have them all sent to him as a single batch, as opposed to him receiving some of his orders only to receive more later on?  Without that guarantee, he may have already driven past one of his customers because the messages he received were out of order, and other field workers receiving their own orders at the same time delayed him in receiving all of his.

image

Putting it together – Modifying Service Bus Queue

A pre-requisite for this type of messaging scenario to work is configuring our Service Bus Queue to support Sessions.  This can be enabled in a couple of different ways:

  • When creating a queue from within the Azure Portal, we can check the Enable sessions checkbox.

image

  • When using the QueueDescription class we can set the RequiresSession property to true.

NamespaceManager namespaceClient = new NamespaceManager(serviceUri, credentials);

if (!namespaceClient.QueueExists(Sender.QueueName))
{
    QueueDescription queueDescription = new QueueDescription(Sender.QueueName)
    {
        RequiresSession = true
    };
    namespaceClient.CreateQueue(queueDescription);
}

BizTalk Configuration

In order to keep the solution very simple, we will create:

  • Two Receive Ports
  • A Receive Location for each Receive Port.  The purpose of these Receive Locations is simply to inject messages into BizTalk so that we can set the SessionID property on the Send Ports.
  • Two Send Ports
    • One for Mike
    • One for Joe
  • Each Send Port will have a filter for the corresponding Receive Port.  Within each Send Port we will configure a SessionID.

image

The other Send Port will use the same URI; however, it will have a different SessionID value, which will be Joe.

image

 

Service Bus Queue Client

The code below will make a connection to our session-ful Service Bus Queue and retrieve all messages that have a SessionId of Mike.

using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using System.Threading.Tasks;
using Microsoft.ServiceBus;
using Microsoft.ServiceBus.Messaging;
using System.Runtime.Serialization;
using BrokeredMessageToBizTalk;

namespace RetrieveServiceBusSession
{
    class Receiver
    {
        const string QueueName = "<your_sessionful_queue>";
        static string ServiceNamespace = "<your_namespace>";
        static string IssuerName = "<your_issuerName>";
        static string IssuerKey = "<your_IssuerKey>";

        static void Main(string[] args)
        {
            TokenProvider credentials = TokenProvider.CreateSharedSecretTokenProvider(Receiver.IssuerName, Receiver.IssuerKey);
            Uri serviceUri = ServiceBusEnvironment.CreateServiceUri("sb", Receiver.ServiceNamespace, string.Empty);

            MessagingFactory factory = MessagingFactory.Create(serviceUri, credentials);

            QueueClient sessionQueueClient = factory.CreateQueueClient(Receiver.QueueName);

            // Create a session receiver subscribed to messages whose SessionId is "Mike"
            MessageSession sessionReceiver = sessionQueueClient.AcceptMessageSession("Mike", TimeSpan.FromSeconds(60));
            BrokeredMessage receivedMessage;

            while ((receivedMessage = sessionReceiver.Receive(TimeSpan.FromSeconds(60))) != null)
            {
                var data = receivedMessage.GetBody<PowerOut>(new DataContractSerializer(typeof(PowerOut)));
                Console.WriteLine(String.Format("Customer Name: {0}", data.CustomerName));
                Console.WriteLine("SessionID: {0}", receivedMessage.SessionId);
                // Remove the message from the Queue
                receivedMessage.Complete();
            }

            Console.WriteLine("All received on this session... press enter to exit");
            Console.Read();
        }
    }
}

 

The code itself is very similar to that of some of my previous blog posts on Service Bus integration.  The main difference is instantiating a MessageSession object.

MessageSession sessionReceiver = sessionQueueClient.AcceptMessageSession("Mike", TimeSpan.FromSeconds(60));

Within this line of code we are indicating that we want to receive messages that belong to the Mike Session.  We can also provide a TimeSpan argument to specify how long we are willing to wait to accept the Session.  Setting this value is more useful when we are looking for any available Session, as it allows all messages within one Session to be processed before moving on to the next Session.

Testing

I have two sets of messages here.  Two of the messages will be routed through the Joe Send Port, and subsequently the SessionID for these two messages will be set to Joe.  The other two messages will be routed through the Mike Send Port and will have their SessionID property set to Mike.

image

As mentioned previously, both Send Ports are configured to send to the same Service Bus Queue.  When we run our client application, the expectation is that messages belonging to the Mike Session will be retrieved.  The Joe messages will remain in the Queue until another receiver pulls them down or the Time To Live (TTL) threshold has been exceeded.

When we start our Consumer application we do discover that the “Mike” messages are processed.

image

So what happened to the other messages?

The “Joe” messages are still in our Queue.  If we navigate to our Windows Azure Portal, we will discover that our Queue Length is set to 2.

image

So how do we get these messages out?

We have a couple of options: we can create another MessageSession instance and retrieve all messages belonging to the Joe Session, or we can avoid specifying a Session, in which case our client will look for the next available Session, which here will be the Joe Session.

Let’s go with the second option and retrieve the next available session.  In order to do so we need to change the following line of code from

  MessageSession sessionReceiver = sessionQueueClient.AcceptMessageSession("Mike", TimeSpan.FromSeconds(60));

to

  MessageSession sessionReceiver = sessionQueueClient.AcceptMessageSession(TimeSpan.FromSeconds(60));

We are essentially no longer specifying a particular Session; we are now interested in any Session.

I will now process another 4 files; 2 will belong to the Joe Session and 2 will belong to the Mike Session.  What we expect to happen is that all 4 Joe messages (the 2 left over from the previous run plus the 2 new ones) will be processed, since Joe is the next available Session.

image

So this proves that we have In Order Delivery occurring at the Session level.  Initially our Mike Session was processed, which left our Joe messages outstanding.  We then loaded 4 more messages into the Queue, and since the Joe messages were first in, they were processed first.  The remaining 2 messages, which belong to Mike, can be retrieved by starting up our client application once again.
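
The "next available Session" behaviour can also be wrapped in a loop so that one client drains every pending Session in turn, preserving order within each Session. This is a sketch built on the receiver code above; it assumes that AcceptMessageSession throws a TimeoutException once no further Sessions become available within the wait time.

```csharp
// Drain all available Sessions, one at a time (sketch; reuses the
// sessionQueueClient from the receiver code above)
while (true)
{
    MessageSession session;
    try
    {
        // Accept whichever Session has pending messages next
        session = sessionQueueClient.AcceptMessageSession(TimeSpan.FromSeconds(10));
    }
    catch (TimeoutException)
    {
        break; // no more Sessions with pending messages
    }

    BrokeredMessage message;
    while ((message = session.Receive(TimeSpan.FromSeconds(10))) != null)
    {
        Console.WriteLine("Session {0}: message received", message.SessionId);
        message.Complete(); // remove the message from the Queue
    }
    session.Close();
}
```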

image

Note:

Within my BizTalk Send Ports I statically configured the SessionID.  This isn't very practical in the real world, but it was easy to demonstrate for the purposes of this blog post.  Much like other BizTalk context properties, the SessionID property can be set dynamically within an Orchestration Message Assignment shape or a Pipeline Component.

image
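
As a rough sketch of the dynamic approach, a custom pipeline component could write the SessionId into the message context in its Execute method. The property name and namespace below are my assumptions based on the BrokeredMessage property schema that ships with the SB-Messaging adapter; verify them against the adapter's property schema in your own installation.

```csharp
// Sketch of a pipeline component Execute method (requires references to
// Microsoft.BizTalk.Pipeline and Microsoft.BizTalk.Message.Interop)
public IBaseMessage Execute(IPipelineContext context, IBaseMessage message)
{
    // In practice this value would be derived from the message body or
    // another context property rather than hard-coded
    string sessionId = "Mike";

    // Assumed property name/namespace for the SB-Messaging adapter -- verify
    // against the BrokeredMessage property schema in your environment
    message.Context.Write(
        "SessionId",
        "http://schemas.microsoft.com/BizTalk/2012/Adapter/BrokeredMessage-properties",
        sessionId);

    return message;
}
```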

Conclusion

Overall I found this functionality pretty neat. It is another capability that we can leverage for more granular control over message processing.  I like the opportunity to group messages together and treat them as a batch.  This also helps when dealing with message size limitations, as we can stitch a bunch of smaller messages together that collectively make up a large message.

Hadoop + SSIS, SSIS + Windows Azure Blob Storage

I worked on a white paper which has just been published on MSDN:

Leveraging a Hadoop cluster from SQL Server Integration Services (SSIS)

I'd like to point out that the paper comes with sample code (thanks Rémi!) that can also be used independently of Hadoop, as it enables data movement to and from Windows Azure Blob storage from SQL Server's ETL tool, SSIS.
The code samples are available at the following URLs:

 

SSIS Packages Sample for Hadoop and Windows Azure

HadoopOnAzure REST API Wrapper Sample

 

Azure Blob Storage Components for SSIS Sample

 

Benjamin

Blog Post by: Benjamin GUINEBERTIERE

Microsoft Integration MVP


I am so pleased to announce the new MVP category: Microsoft Integration MVP.
For the last 12 years I have always been doing integration. There is just something about being able to reach out to other business systems and not only make them talk to each other, but make the connections between them dance: lay down the right architecture and make it work.
The technology I used also varied, be it WCF, BizTalk, Azure, SQL Azure, Azure XYZ, SharePoint, SQL Server or anything else. When you do integration, you previously had to choose which MVP to be. Now, thanks to Microsoft, you don't need to choose anymore.
I could not be happier with the name, as it reflects exactly what we do. I will write posts and articles on any of these topics, all under the banner of integration, and all on the Microsoft technology stack. Bring it on, I say.
DocumentSpecName usage in the pipeline configuration

A schema can be defined in the 'DocumentSpecName' attribute of the pipeline configuration properties (in the disassemble stage). When no schema is defined and there are multiple schemas with the same combination of 'Target Namespace' (optional) and 'Root Name', you'll get the following error in the receive pipeline: There was a failure executing the receive pipeline: […]
Blog Post by: Cnext

Want to try BizTalk 2010 R2

For sure, I'm a little bit late to talk about this, but you have been able to test the BizTalk 2010 R2 CTP since the start of September. Prerequisite: you need to have a Windows Azure account (if not, you can create one for free for 90 days). First, log on to https://manage.windowsazure.com, go to the Virtual Machines tab, click Create New […]
Blog Post by: Jeremy Ronk