Encoding an MP4 file to Smooth Streaming and Apple HLS in the cloud

Since June 2012, a preview of Windows Azure Media Services has been available. This post provides sample code and execution screenshots for the following scenario: encoding an MP4 file to Smooth Streaming and Apple HLS in the cloud.
You can find a lot of good resources on how to get started with Windows Azure at http://www.windowsazure.com, and more specifically at https://www.windowsazure.com/en-us/home/scenarios/media/ for Windows Azure Media Services.
In French, this post is also a good starting point.
The Windows Azure Media Services SDK documentation is available in the MSDN Library.

 

After following the first steps of https://www.windowsazure.com/en-us/develop/net/how-to-guides/media-services/, we have a machine with Visual Studio 2010, the Windows Azure Media Services SDK, and Windows Azure SDK 1.6 (the Windows Azure Media Services preview does not yet work with Windows Azure SDK 1.7). At this stage, we also have a Windows Azure Media Services account and the corresponding key.

 

Let’s start a console app running on .NET Framework 4
with the following references:

 

Add some code:

 

 

using System;
using System.Collections.Generic;
using System.IO;
using System.Linq;
using System.Text;
using Microsoft.WindowsAzure.MediaServices.Client;

namespace WindowsAzureMediaServicesSample
{
    class Program
    {
        [STAThread]
        static void Main(string[] args)
        {
            try
            {
                Console.WriteLine("retrieving context");
                CloudMediaContext context = new CloudMediaContext(Configuration.AccountName, Configuration.AccountKey);
                Console.WriteLine("context is retrieved");

                //(new SampleCode2(context)).Show();
                //(new SampleCode2(context)).Reset();
                //(new SampleCode2(context)).Show();
                (new SampleCode5(context)).Run();
            }
            catch (Exception ex)
            {
                Console.WriteLine("Oops: {0}", ex);
            }
            finally
            {
                Console.WriteLine("---");
                Console.ReadLine();
            }
        }
    }
}

 

SampleCode2:

using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using Microsoft.WindowsAzure.MediaServices.Client;

namespace WindowsAzureMediaServicesSample
{
    public class SampleCode2
    {
        private CloudMediaContext context = null;

        public SampleCode2(CloudMediaContext context)
        {
            this.context = context;
            this.context.Assets.OnUploadProgress += Assets_OnUploadProgress;
        }

        public void Show()
        {
            Console.WriteLine("--- media processors ---");
            foreach (var mp in context.MediaProcessors)
            {
                Console.WriteLine("name={0}, Vendor={1}, version={2}", mp.Name, mp.Vendor, mp.Version);
            }

            Console.WriteLine();

            foreach (var job in context.Jobs)
            {
                Console.WriteLine("{0} {1} {2} {3} {4}", job.Name, job.State, job.Created, job.EndTime, job.RunningDuration);
                foreach (var t in job.Tasks)
                {
                    Console.WriteLine("{0} {1}", t.Name, t.State);
                    foreach (var d in t.ErrorDetails)
                    {
                        Console.WriteLine("\t{0}\t{1}", d.Code, d.Message);
                    }
                }
                Console.WriteLine();
            }

            foreach (var p in context.AccessPolicies)
            {
                Console.WriteLine("policy Id={0}, name={1}, modified={2}, permissions={3}, duration={4}"
                    , p.Id, p.Name, p.LastModified, p.Permissions, p.Duration);
            }

            foreach (var l in context.Locators)
            {
                Console.WriteLine("Id={0}, Path={1}, expires={2}", l.Id, l.Path, l.ExpirationDateTime);
                var p = l.AccessPolicy;
                Console.WriteLine("policy Id={0}, name={1}, modified={2}, permissions={3}, duration={4}"
                    , p.Id, p.Name, p.LastModified, p.Permissions, p.Duration);
            }
        }

        public void Reset()
        {
            Console.WriteLine("will reset");

            foreach (var job in context.Jobs)
            {
                job.Delete();
            }

            foreach (var l in context.Locators)
            {
                context.Locators.Revoke(l);
            }

            foreach (var p in context.AccessPolicies)
            {
                context.AccessPolicies.Delete(p);
            }

            foreach (var a in context.Assets)
            {
                context.Assets.Delete(a);
            }
        }

        private void Assets_OnUploadProgress(object sender, UploadProgressEventArgs e)
        {
            // Note: "1024 ^ 2" is C#'s XOR operator, not exponentiation;
            // use (1024 * 1024) and convert to double to get megabytes.
            Console.WriteLine("Assets_OnUploadProgress: {0:0.00} %, {1} / {2}, {3:0.00} MB / {4:0.00} MB",
                e.Progress, e.CurrentFile, e.TotalFiles,
                Convert.ToDouble(e.BytesSent) / (1024 * 1024), Convert.ToDouble(e.TotalBytes) / (1024 * 1024));
        }
    }
}

 

HLSConfiguration.xml:

<?xml version="1.0" encoding="utf-8" ?>
<!-- cf http://msdn.microsoft.com/en-us/library/hh973636.aspx -->
<taskDefinition xmlns="http://schemas.microsoft.com/iis/media/v4/TM/TaskDefinition#">
  <name>Smooth Streams to Apple HTTP Live Streams</name>
  <id>A72D7A5D-3022-45f2-89B4-1DDC5457C111</id>
  <description xml:lang="en">Converts on-demand Smooth Streams encoded with H.264 (AVC) video and AAC-LC audio codecs to Apple HTTP Live Streams (MPEG-2 TS) and creates an Apple HTTP Live Streaming playlist (.m3u8) file for the converted presentation.</description>
  <inputDirectory></inputDirectory>
  <outputFolder>TS_Out</outputFolder>
  <properties namespace="http://schemas.microsoft.com/iis/media/AppleHTTP#" prefix="hls">
    <property name="maxbitrate" required="true" value="1600000" helpText="The maximum bit rate, in bits per second (bps), to be converted to MPEG-2 TS. On-demand Smooth Streams at or below this value are converted to MPEG-2 TS segments. Smooth Streams above this value are not converted. Most Apple devices can play media encoded at bit rates up to 1,600 Kbps."/>
    <property name="manifest" required="false" value="" helpText="The file name to use for the converted Apple HTTP Live Streaming playlist file (a file with an .m3u8 file name extension). If no value is specified, the following default value is used: &lt;ISM_file_name&gt;-m3u8-aapl.m3u8"/>
    <property name="segment" required="false" value="10" helpText="The duration of each MPEG-2 TS segment, in seconds. 10 seconds is the Apple-recommended setting for most Apple mobile digital devices."/>
    <property name="log" required="false" value="" helpText="The file name to use for a log file (with a .log file name extension) that records the conversion activity. If you specify a log file name, the file is stored in the task output folder." />
    <property name="encrypt" required="false" value="false" helpText="Enables encryption of MPEG-2 TS segments by using the Advanced Encryption Standard (AES) with a 128-bit key (AES-128)." />
    <property name="pid" required="false" value="" helpText="The program ID of the MPEG-2 TS presentation. Different encodings of MPEG-2 TS streams in the same presentation use the same program ID so that clients can easily switch between bit rates." />
    <property name="codecs" required="false" value="false" helpText="Enables codec format identifiers, as defined by RFC 4281, to be included in the Apple HTTP Live Streaming playlist (.m3u8) file." />
    <property name="backwardcompatible" required="false" value="false" helpText="Enables playback of the MPEG-2 TS presentation on devices that use the Apple iOS 3.0 mobile operating system." />
    <property name="allowcaching" required="false" value="true" helpText="Enables the MPEG-2 TS segments to be cached on Apple devices for later playback." />
    <property name="passphrase" required="false" value="" helpText="A passphrase that is used to generate the content key identifier." />
    <property name="key" required="false" value="" helpText="The hexadecimal representation of the 16-octet content key value that is used for encryption." />
    <property name="keyuri" required="false" value="" helpText="An alternate URI to be used by clients for downloading the key file. If no value is specified, it is assumed that the Live Smooth Streaming publishing point provides the key file." />
    <property name="overwrite" required="false" value="true" helpText="Enables existing files in the output folder to be overwritten if converted output files have identical file names." />
  </properties>
  <taskCode>
    <type>Microsoft.Web.Media.TransformManager.SmoothToHLS.SmoothToHLSTask, Microsoft.Web.Media.TransformManager.SmoothToHLS, Version=1.0.0.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35</type>
  </taskCode>
</taskDefinition>

 

SampleCode5:

using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using Microsoft.WindowsAzure.MediaServices.Client;
using System.IO;
using System.Windows.Forms;

namespace WindowsAzureMediaServicesSample
{
    public class SampleCode5
    {
        private System.Timers.Timer jobTimer;
        private string runningJobId = null;
        private bool jobCompleted = false;
        private IJob job;
        private CloudMediaContext context = null;

        public SampleCode5(CloudMediaContext context)
        {
            this.context = context;
            this.context.Assets.OnUploadProgress += Assets_OnUploadProgress;
        }

        public void Run()
        {
            #region upload or find MP4 initial asset
            IAsset asset = null;
            if (!Configuration.SampleFile3AlreadyUploaded)
            {
                asset = context.Assets.Create(Configuration.SampleFile3LocalPath, AssetCreationOptions.None); // an MP4 file
            }
            else
            {
                foreach (var a in context.Assets)
                {
                    foreach (var f in a.Files)
                    {
                        if (string.Compare(f.Name, Configuration.SampleFile3, false) == 0)
                        {
                            asset = a;
                            break;
                        }
                    }
                    if (asset != null) break;
                }
            }
            #endregion

            #region submit or find job 1: MP4 => Smooth Streaming
            job = context.Jobs.Where(j => j.Name == Configuration.JobNameSampleCode5).SingleOrDefault<IJob>();
            if (job == null)
            {
                Console.WriteLine("submitting job");

                var q = from p in context.MediaProcessors
                        where p.Name == "Windows Azure Media Encoder"
                        select p;

                IMediaProcessor processor = q.FirstOrDefault<IMediaProcessor>();
                if (processor == null)
                {
                    throw new ApplicationException("media processor for smooth streaming not found");
                }

                job = context.Jobs.Create(Configuration.JobNameSampleCode5);

                ITask task = job.Tasks.AddNew("sample encoding task",
                    processor,
                    "H.264 IIS Smooth Streaming - HD 720p CBR" /* cf http://msdn.microsoft.com/en-us/library/jj129582.aspx */,
                    TaskCreationOptions.None);
                task.InputMediaAssets.Add(asset);
                IAsset task1OutputAsset = task.OutputMediaAssets.AddNew(
                    "Smooth Streaming Output", true, AssetCreationOptions.None);

                var q2 = from p in context.MediaProcessors
                         where p.Name == "Smooth Streams to HLS Task"
                         select p;

                IMediaProcessor processor2 = q2.FirstOrDefault<IMediaProcessor>();
                if (processor2 == null)
                {
                    throw new ApplicationException("media processor for HLS task not found");
                }

                string hlsConfiguration = File.ReadAllText("HLSConfiguration.xml");
                ITask task2 = job.Tasks.AddNew("sample HLS task 2",
                    processor2, hlsConfiguration,
                    TaskCreationOptions.None);
                task2.InputMediaAssets.Add(task1OutputAsset); // start from output assets of previous task
                task2.OutputMediaAssets.AddNew("HLS Output", true, AssetCreationOptions.None);

                job.Submit();
            }
            #endregion

            WaitForJob();

            // If the job completed, publish the files for smooth streaming
            if (jobCompleted)
            {
                Console.WriteLine("publishing smooth streaming and HLS results");
                IAccessPolicy streamingPolicy = context.AccessPolicies.Create("Streaming policy",
                    TimeSpan.FromDays(5), AccessPermissions.Read);

                foreach (var t in job.Tasks)
                {
                    foreach (var outputAsset in t.OutputMediaAssets)
                    {
                        Console.WriteLine("output asset: {0} has {1} file(s)", outputAsset.Name, outputAsset.Files.Count);
                        foreach (var f in outputAsset.Files.Where(x => x.Name.EndsWith(".ism")))
                        {
                            if (f.Name.Contains("m3u8"))
                            {
                                #region publish HLS to a WAMS origin
                                Console.WriteLine("will create a new locator for HLS");

                                IFileInfo manifestFile = f;
                                ILocator originLocator = context.Locators.CreateOriginLocator(
                                    outputAsset, streamingPolicy, DateTime.UtcNow.AddMinutes(-5));

                                string urlForClientStreaming = originLocator.Path + manifestFile.Name
                                    + "/manifest(format=m3u8-aapl)";

                                Console.WriteLine("URL to manifest for client HLS streaming: ");
                                Console.WriteLine(urlForClientStreaming);
                                Clipboard.SetText(urlForClientStreaming);
                                Console.WriteLine("---");
                                Console.ReadLine();
                                #endregion
                            }
                            else
                            {
                                #region publish smooth streaming to a WAMS origin
                                IFileInfo manifestFile = f;
                                ILocator originLocator = context.Locators.CreateOriginLocator(
                                    outputAsset, streamingPolicy, DateTime.UtcNow.AddMinutes(-5));

                                string urlForClientStreaming = originLocator.Path + manifestFile.Name + "/manifest";
                                Console.WriteLine("URL to manifest for client smooth streaming: ");
                                Console.WriteLine(urlForClientStreaming);
                                Clipboard.SetText(urlForClientStreaming);
                                Console.WriteLine("---");
                                Console.ReadLine();
                                #endregion

                                #region download smooth streaming files locally
                                //string localFileName = Path.Combine(Configuration.OutputFolder, f.Name);
                                //Console.WriteLine("Asset {0}, downloading to {1}", outputAsset.Id, localFileName);
                                //f.OnDownloadProgress += new EventHandler<DownloadProgressEventArgs>(f_OnDownloadProgress);
                                //f.DownloadToFile(localFileName);
                                #endregion
                            }
                        }
                    }
                }
            }
            else
            {
                Console.WriteLine("Please check job again later.");
            }
        }

        private void WaitForJob()
        {
            runningJobId = job.Id;

            if (job.State == JobState.Finished)
            {
                jobCompleted = true;
                Console.WriteLine("");
                Console.WriteLine("********************");
                Console.WriteLine("Job state is: " + job.State + ".");
                foreach (var t in job.Tasks)
                {
                    Console.WriteLine("task {0} state={1} duration={2}, PerfMessage={3}",
                        t.Name, t.State, t.RunningDuration, t.PerfMessage);
                }
                Console.WriteLine("Job completed successfully.");
                return;
            }

            // Expected polling interval in milliseconds. Adjust this
            // interval as needed based on estimated job completion times.
            const int JobProgressInterval = 10000;

            // Create a timer with the specified interval, and an event
            // to check job progress. This is an optional workaround
            // because job progress checking is not currently implemented.
            this.jobTimer = new System.Timers.Timer(JobProgressInterval);
            // Hook up an event handler.
            jobTimer.Elapsed += new System.Timers.ElapsedEventHandler(jobTimer_Elapsed);

            jobTimer.Start();

            Console.WriteLine("Please wait, checking job status...");
            // Wait for timer to elapse.
            Console.ReadLine();
            // After the job progress event, stop the timer.
            jobTimer.Stop();

            // Refresh the reference to the job object.
            context.Detach(job);
            job = context.Jobs.Where(j => j.Id == runningJobId).SingleOrDefault();
        }

        void f_OnDownloadProgress(object sender, DownloadProgressEventArgs e)
        {
            Console.WriteLine("Download progress: {0:0.00} %, {1:0.00} MB / {2:0.00} MB",
                e.Progress, Convert.ToDouble(e.BytesDownloaded) / (1024 * 1024), Convert.ToDouble(e.TotalBytes) / (1024 * 1024));
        }

        void jobTimer_Elapsed(object sender, System.Timers.ElapsedEventArgs e)
        {
            // Get a refreshed job reference each time the event fires.
            IJob theJob = context.Jobs.Where(j => j.Id == runningJobId).SingleOrDefault();

            // Check and handle various possible job states.
            switch (theJob.State)
            {
                case JobState.Finished:
                    jobCompleted = true;
                    Console.WriteLine("");
                    Console.WriteLine("********************");
                    Console.WriteLine("Job state is: " + theJob.State + ".");
                    foreach (var t in theJob.Tasks)
                    {
                        Console.WriteLine("task {0} state={1} duration={2}, PerfMessage={3}",
                            t.Name, t.State, t.RunningDuration, t.PerfMessage);
                    }
                    Console.WriteLine("Job completed successfully.");
                    Console.WriteLine("Press Enter to complete the job.");
                    jobTimer.Stop();
                    break;
                case JobState.Queued:
                case JobState.Scheduled:
                case JobState.Processing:
                    Console.WriteLine("Job state is: " + theJob.State + ".");
                    Console.WriteLine("Continue waiting for the job to complete, or " +
                        "press Enter in the console to exit without waiting.");
                    break;
                case JobState.Error:
                    Console.WriteLine("Error:");
                    foreach (var t in theJob.Tasks)
                    {
                        Console.WriteLine("task {0} state={1}", t.Name, t.State);
                        Console.WriteLine("\tdetails:");
                        foreach (var d in t.ErrorDetails)
                        {
                            Console.WriteLine("\t{0}\t{1}", d.Code, d.Message);
                        }
                        Console.WriteLine("---");
                        //Console.WriteLine("task body: '{0}'", t.TaskBody);
                    }
                    jobTimer.Stop();
                    break;
                default:
                    Console.WriteLine(theJob.State.ToString());
                    break;
            }

            // Detach the job to prevent the reference going stale.
            context.Detach(theJob);
        }

        void Assets_OnUploadProgress(object sender, UploadProgressEventArgs e)
        {
            // Convert each operand to double before dividing so integer
            // division does not truncate the MB values.
            Console.WriteLine("Assets_OnUploadProgress: {0:0.00} %, {1} / {2}, {3:0.00} MB / {4:0.00} MB",
                e.Progress, e.CurrentFile, e.TotalFiles,
                Convert.ToDouble(e.BytesSent) / (1024 * 1024), Convert.ToDouble(e.TotalBytes) / (1024 * 1024));
        }
    }
}

You can download the C# and XML files from http://sdrv.ms/NsNMYX
Let’s execute it.

A few containers were created in the Windows Azure blob storage account. One of them is the source asset:
Note that most of them are intermediary assets that will be destroyed when the job is finished.

Blob storage now looks like this:

Paste the URL into an HTML file that contains the Smooth Streaming player.

From a browser, you get the video (which also happens to talk about Windows Azure, by the way!).
I don’t have a screenshot of the same video on an iPhone from the smooth.html page, but I tested that as well.

 

The Web site code and the XAP file can be downloaded from http://sdrv.ms/NbUyEe

 

Benjamin

Blog Post by: Benjamin GUINEBERTIERE

Setting custom SOAP headers in the WCF adapter

While browsing for the answer to the question “How do I add SOAP headers to a message sent using the WCF-Custom or WCF-BasicHttp adapter?”, I never really found a good, short answer. So I thought I’d give it a go.

Setting SOAP headers

I assume you know what SOAP-headers are and why you might use them. If not, then back to basics.

In my case the client needed BizTalk to send requests with the WS-Addressing SOAP header called “To”. I needed the easiest way to do this, preferably using configuration and no orchestrations.

To the best of my knowledge, this is the simplest way to do it.

Using a pipeline

Use a pipeline component to promote this property: http://schemas.microsoft.com/BizTalk/2006/01/Adapters/WCF-properties#OutboundCustomHeaders.

My guess is that your local BizTalk code hub already has a pipeline component to promote arbitrary values. If you do not, the code for promoting the property is here.

The only thing to remember is that the value of the property is a bit special. You can hard-code the values of your headers, even using XML formatting, but you have to surround the value with a <headers> tag.

<headers>
<h:To xmlns:h="http://www.w3.org/2005/08/addressing">rcvAddr</h:To>
</headers>
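If you do not have such a component to hand, here is a minimal sketch of what the promotion could look like in a send-pipeline component. The class name is hypothetical and the IComponent/IBaseComponent plumbing is omitted; the property name and namespace are the ones given above, and writing the value to the message context is what the WCF adapter picks up.

```csharp
// Hypothetical pipeline component sketch: writes the OutboundCustomHeaders
// context property so the WCF adapter emits the SOAP headers on send.
using Microsoft.BizTalk.Component.Interop;
using Microsoft.BizTalk.Message.Interop;

public partial class SoapHeaderPromoter
{
    // Namespace of the WCF adapter context properties.
    private const string WcfPropertiesNamespace =
        "http://schemas.microsoft.com/BizTalk/2006/01/Adapters/WCF-properties";

    public IBaseMessage Execute(IPipelineContext pContext, IBaseMessage pInMsg)
    {
        // The headers value: one or more header elements wrapped in <headers>.
        string headers =
            "<headers>" +
            "<h:To xmlns:h=\"http://www.w3.org/2005/08/addressing\">rcvAddr</h:To>" +
            "</headers>";

        // Context.Write is sufficient here; the WCF adapter reads the
        // OutboundCustomHeaders property from the message context on send.
        pInMsg.Context.Write("OutboundCustomHeaders", WcfPropertiesNamespace, headers);
        return pInMsg;
    }
}
```

In practice you would make the headers string a configurable property of the component rather than hard-coding it, so the same component can serve different send ports.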

This will result in the WCF adapter serializing a SOAP envelope with SOAP headers that contains the value you give between the <headers> tag.

Here is the result in my environment:

Using an orchestration

This is a bit more work, but a very useful way to get the same result if you already have an orchestration. A bit more information can be found here.

What you basically do is set the property from an assignment shape, much like you would access FILE.ReceivedFileName.

outboundMessageInstance(WCF.OutboundCustomHeaders) =
"<headers><h:To xmlns:h=\"http://www.w3.org/2005/08/addressing\">rcvAddr</h:To></headers>";

Blog Post by: Mikael Sand

BizTalk Community series: Introducing Lex Hegt

Before I head to sunny Spain for a nice vacation, I have one more story for you about a hard-working BizTalk community member: Lex Hegt. He is a former colleague of mine who works for Ordina. Lex is the founder of the recently launched website BizTalk Events, a good initiative that gives you a single access point to resources (slides, links, speakers, etc.) from numerous BizTalk events. The site has a calendar that shows past and upcoming events.

Lex lives in Leidschendam, close to The Hague, with his wife Odette and children (9-year-old daughter Denise and almost-7-year-old son Casper). He is currently stationed at the Dutch Ministry of Foreign Affairs, where he works in the roles of BizTalk Administrator and Architect. On the combination of roles, Lex says:

“This might sound like a strange combination, but to me it makes perfect sense! A BizTalk admin needs to be pretty well skilled to be able to do his job. Combine that with BizTalk development experience, and you have a lot of the technical skills needed to be an architect!”

Lex has worked in IT since 1985 and in the early years held roles such as helpdesk employee, systems administrator, application engineer, developer, and functional designer. Integration of information systems has always intrigued him; his first (serious) integration solution, in the mid-nineties, connected an IBM mainframe and WordPerfect 5.1, enabling secretaries to retrieve patient information from the mainframe and merge it into letters for family doctors.
Eight years ago, when Lex started to work for his current employer Ordina, he could choose which Microsoft technology he wanted to work with. Because of his interest in BizTalk since its betas (2000), he chose that product, a choice he has never regretted since.

Lex on the beginning of his career:

“In my early days in IT, when Internet was not as widely available as it is now, I already wanted to share my experience and be innovative. To challenge myself to further develop myself and just because it is fun, I shared my experience with my direct colleagues.”

“When I started with BizTalk, I worked as closely as possible with my more experienced colleagues like Randal van Splunteren (who’s now a MVP) and Isaac Ferreira. They left Ordina in 2006 and I inherited the BIA-blog they had initiated. So since that time, I am blogging my experiences there. At that time I also tried answering questions on BizTalkGurus. It is also since that time that I know Steef-Jan and other people from who many are still active in the BizTalk Community.”

More recently, Lex started writing blog posts on BizTalkAdminsBlogging, which was initiated by Jeroen Hendriks (Axon Olympus). On this blog, around 10 authors write about topics that target BizTalk admins. This blog, other blogs, and the many events with presentations on administrative topics make sure that BizTalk admins are heard. This is important because of the specific experience they have: you need them involved during the design, development, and deployment-preparation phases of a BizTalk project. The goal, of course, is to create better BizTalk solutions all together!

More recently still, Lex started a website called BizTalk Events. With this website he wants to make BizTalk-related events around the globe, mainly from BizTalk and Connected Systems user groups, more visible by showing them in a calendar. Afterwards, when available, he provides links to downloadable content (slide decks/demos) from the events and/or (links to) reviews of them. In this way Lex hopes to give the events a bit more attention, resulting in more widespread knowledge of BizTalk!
Lex comments on his site:
“To me the BizTalk Events site has shown very clearly that the rumor that BizTalk is dead is far from the truth! For the period from May until September 2012 there are now 18 events around the world registered on BizTalk Events! Let’s say there is an average of 30 attendees per event. That would mean around 540 attendees around the globe learning a lot about different aspects of BizTalk!”

Lex has a strong opinion on BizTalk Server:

“Since its early releases BizTalk has evolved into a mature product with support for many protocols and lines of business. The developer experience and the out-of-the-box capabilities for administrators have also improved amazingly since those early releases. The most intriguing capability of BizTalk, however, and actually the reason for its existence, is that it enables organizations to connect their internal systems and/or external parties, which leads to better streamlined business processes, resulting in more efficient organizations achieving their goals and serving their customers. From day 1 this has been my motivation to work with BizTalk and why I like working with it so much! I simply can’t get enough of seeing thousands of messages work their way through BizTalk, supporting the business process, the organization, and the customer!”

Lex is a busy bee; however, in his spare time he likes to play field hockey, do a lot of stuff on his computer, stargaze, have a whisky (preferably single malt), drink wine (Gutturnio is his latest favorite), and have dinner with great food, wine, and company. He also enjoys the rare moments that he and his wife have when the kids don’t sleep at home: they love to have a nice dinner for two and catch a movie.

Finally, Lex would like to thank all the readers of his blog posts, the people who follow the events on BizTalkEvents and his followers on Twitter! And to Steef-Jan:

“It is unbelievable how much valuable content you have generated! You have been and still are a great example to me!”

One final statement from Lex:

“Using the immense content generated by the BizTalk Community for your own good is smart, but sharing your own experience through participation in the BizTalk Community is even better!”

Thanks, Lex, for your time and your contributions to the BizTalk community. I enjoyed working together with you at Ordina and at some of the events that were organized for BizTalk Server.

Cheers,

– Steef-Jan

Troubleshooting: ‘This type of page not served’

Recently when deploying a site to IIS I got the following error.

I tried several troubleshooting options, such as checking the IIS mapping configuration to see if .aspx pages were an accepted page extension and checking folder permissions on the deployment folder, and I knew for a fact that .NET was correctly installed on my server.
After some […]
Blog Post by: Karl Schwirz

No Results Showing Up in QRServer Queries in FAST Search Server 2010 for SharePoint

Overview
FAST Search Server 2010 comes with a web application for executing FQL queries against the FAST index, called QRServer.  One of our clients was having an issue executing FQL queries, only getting data back that was being crawled by the FAST Web Crawler, and none of the SharePoint data being crawled by the FAST Content […]
Blog Post by: Michael Gerety

BizTalk and empty files

There are a lot of things I do not know about BizTalk. The list is getting shorter but here is something I found.

I was trying to verify a flow within a known environment. Everything else seemed to work apart from this one flow. A technician submitted files to a directory and they were picked up. However, nothing showed up in any tracking; neither the basic BizTalk tracking nor our BAM implementation noticed the file.

I verified that the file was indeed picked up by BizTalk, but I could not submit a file myself as I did not have access to the path.

After a while I remembered to check the log on the other BizTalk node in the cluster, and then it became clear. A simple warning said: “The FILE receive adapter deleted the empty file "\\Server\Share\testfile.txt" without performing any processing.” I have to admit that I did not know that. It is actually a “known issue”.

What happens is that the file is picked up, but since the technician had just created the files using the old Right-click + New, the file was empty. BizTalk does not process empty streams, and the file is deleted without leaving any trace in tracking.

Here’s a tip

In some scenarios you might receive an empty file to start a flow within BizTalk. Perhaps some system is telling BizTalk “that data you’re so interested in is done”. Make sure that the file contains some data, perhaps just a repeat of the file name or the letter “M”.
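The tip above can be sketched in a few lines. This is a minimal illustration, not BizTalk API code: it simply writes a trigger file whose body repeats the file name, so it is never zero bytes and the FILE receive adapter will not silently delete it. The share path in the usage comment is hypothetical.

```python
from pathlib import Path

def drop_trigger_file(share: Path, name: str) -> Path:
    """Write a trigger file that is guaranteed to be non-empty,
    so the BizTalk FILE receive adapter will not delete it
    without processing. The body just repeats the file name,
    as suggested in the tip above."""
    target = share / name
    target.write_text(name)  # any non-empty payload will do
    return target

# Example (hypothetical receive-location share):
# drop_trigger_file(Path(r"\\Server\Share"), "testfile.txt")
```

The point is only that the payload is non-empty; what it contains is up to you, as long as the receiving flow does not mind.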

Blog Post by: Mikael Sand

AAP25 Why we fail: An architect’s journey to the private cloud

This presentation is not available on Channel9 at the moment. That is a shame, because it was, in my opinion, the best presentation of the whole conference.

The session was presented by Alex Jauch, currently at NetApp, though he used to work for Microsoft. He was actually behind the precursor of what became the MCA. I had never even heard of this guy before, which I would say is a shame. I have now, though.

The heading of the session seems ominous and deterministic, but given my personal experience I would say that it is not far from the truth to simply assume that “cloudification” will fail. Incidentally, it is also the title of Alex’s book 🙂

Alex (or should I say Mr. Jauch?) started the session by clearly stating that he was about to say things that not all of us would agree with. He would also try to upset some people! Bold and funny, in my opinion.

Definition

The, or even a, definition of what cloud computing really is can be hard to come by, and one definition might differ a lot from the next. Alex presented the definition made by NIST. He pointed to the fact that NIST is a governmental agency, and these are notorious for not agreeing on anything. The fact that they have agreed on a definition of cloud computing gives it some credibility.

According to NIST there are five essential characteristics that together form a cloud. If any of these is left out, you are not a true cloud provider. They are:

On-demand self-service. A consumer should be able to change provisioning in the cloud by him/herself.

Broad network access. Capabilities are available over the network and accessed through standard mechanisms.

Resource pooling. The provider’s computing resources are pooled to serve multiple consumers using a multi-tenant model.

Rapid elasticity. Capabilities can be elastically provisioned and released, in some cases automatically, to scale rapidly outward and inward commensurate with demand.

Measured service. Resource usage can be monitored, controlled, and reported, providing transparency for both the provider and consumer of the utilized service.

So if your cloud does not offer a self-service portal where the consumer can add resources on their own, you do not have a cloud service.

The full definition (2 pages) can be found here.

So why do we fail?

I say that it comes down to this comparative table:

Traditional IT                  | Customer Centric IT (Cloud)
--------------------------------|------------------------------------
Sets IT standards               | Supports business requirements
Focus on operations excellence  | Focus on customer satisfaction
Engineering is key skillset     | Consulting is key skillset
Sets policy                     | Seeks input
Focus on large projects         | Focus on smaller projects
Organized by technology         | Organized by customer
Technology focus                | Business value focus
Delivers most projects in house | Brokers external vendors as needed

It is not technology that makes us fail. It is how we use it and the attitudes of those who implement it. When we try to run a cloud service “as we always have”, in a traditional manner, that is when we fail.

In order to run a successful cloud shop, we must change focus and really (and he means really) focus on the customer. A very telling quote from the session was about the focus on operations versus the focus on the customer:

“‘We had 100% uptime last month.’ What does that mean if the customer still has not managed to sell anything?”

So if someone tells you “we sell cloud”, at least ask them about the five points from the NIST definition.

Blog Post by: Mikael Sand