Local Windows Azure: Integrate, Innovate & Australia just got smarter

Well folks, I’ve been greeted with the news that Microsoft Windows Azure will
be available in two geo-replicated locations here on Australian soil,
coming ’shortly’.

As an Azure MVP, and from the perspective of Breeze (a
leading Microsoft Cloud Partner), we invest heavily in cloud technologies.

What does this mean and why should I care? I hear you ask. Good question –
I asked the same.

As most of you know, I have a passion for Integration – sticking all sorts of things
together, from small RFID devices, hand-made hand-held devices and Raspberry Pis through
to high-end ERP, Financials & many other types of systems. So before I get to
the WHY aspect, let me briefly set the context.

There’s some great data coming out of Gartner. A report which caught my eye – http://searchsoa.techtarget.com/news/2240173583/Gartner-Better-collaboration-for-new-era-of-application-integration – came
out with these findings:

  • Integration costs will rise by 33% by 2016, and
    more than half of new system development costs will be spent on integration.

  • By 2017, over two-thirds of all new integration
    flows will extend outside the enterprise firewall.

So Integration just took on a whole new face – successful integration is about
using the right tools (in the toolbox) for the right task.
Now we have a
whole new drawer in our toolbox full of Azure goodies & widgets. This functionality
is just too compelling to be ignored.

And now that it’s on Australian soil, I’d be thinking that just about every data center
service provider should be giving you cloud functionality.

Some quick cloud advantages:

  • scale, provisioning and ease of use

    • Imagine being able to spin up a SharePoint site in the time it takes me to write this
      article.

    • Imagine being able to ask for an extra load balanced highly available Server/Service
      at the click of a button. Importantly – Imagine being able to give it back again at
      the end of the weekend/day/next hour.

  • No waiting the typical 12 weeks for a new server to be provisioned – oh, and don’t mention
    filling out the right forms. Running an application on those machines and getting
    a firewall port opened? That’ll be another 2 weeks, and on it goes.

  • The much-beloved enlightenment for many companies of achieving Single Sign-On – imagine
    your customers being able to sign into your applications using their own IDs, Live
    IDs, plus a bunch of other IDs, without you needing to provision more services. You can
    house your identity accounts in Azure, locally or elsewhere – finally you don’t need
    a quantum analyst to set up Single Sign-On.

  • My experiences in the last few weeks on client sites have been back in the world of
    old – classic, encumbered infrastructure service providers wanting to claim everything,
    put the brakes on any new ideas and hold meetings around such concepts as adding an
    extra 10GB of disk space to existing servers. These guys should be ’can do’ people –
    it’s all about choosing the right tool for the job.

  • Microsoft has done a great job on the developer tooling front, from the classic MS
    toolset through to Apple, PHP, Ruby, Python, etc. – all able to access, develop
    on, publish and deploy.

  • We could even give a bunch of hard drives to Olaf (our gun cyclist @ Breeze) to ride
    to the Azure Data Center and offload our data, while we wait for the NBN to never
    come to our area.

  • There are some great options coming down the track.

So let’s say we’re keen to explore – how hard/easy is it to get ’my’ own environment,
& what does this mean?

The short answer is you get an Azure footprint, which could be running in a ’Data Center’
in Sydney. Depending on what you’re playing with, you could get:

– SQL Databases, Cloud Services, Scalable Mobile Device Services, load-balanced Websites/Services/RESTful
endpoints – and the list of ’widgets’ goes on and on.

How do I interact with this environment?

Often the issue around a lot of this is that, because my beloved ’servers’ are running
somewhere else, I’m concerned about how much control we get.

We enter the Hybrid Integration space – where, as you can imagine,
not *everything* is suited to the Cloud; there will be things you keep exactly as
they are. So there will be many, many scenarios where we have something running locally
as well as something running in Azure. Some options we have available to make
our servers ’feel at home’ are:

  • VPN connection – we can have several flavours of VPN connection
    that connect our Azure footprint to our local network, e.g. local
    network is 10.10.x.x/16, Azure network is 10.50.x.x/16. Full access to all the machines/services
    and other things you have running: CRON jobs, FTP, scripts, processes, Linux boxes,
    Samba shares, etc. (I do realise the integration world is never as easy as we
    see it in the magazines.)

  • RDP Connections – a standard level of service, really, from any service
    provider.

  • Remote PowerShell Access

  • Azure Service Bus – application-level Web/WCF/RESTful Services connectivity.
    An Application Service can run either locally or in the cloud, and this feature allows
    your Service to be accessed through a consistent Endpoint within the cloud, but the
    calls are relayed down to your Application Service. There are a few different ways we
    can ’relay’, but the public endpoint could house all the clients & their device
    requests, while your existing application infrastructure remains unchanged.

  • SQL Azure Data Sync – sync data between cloud & local databases. So for many
    clients, come 8pm each day, their local database has all the
    orders for the day as per normal, without the usual provisioning headaches as the
    business responds to new market opportunities to support smart devices (see the
    sketch after this list).

  • We even get pretty graphs.

    • But wait, there’s more…

    • These details are typical performance monitor counters + diagnostic information. We
      can use the Azure admin tools to collect these regularly and import them into our typical
      tools.

    • System Center does exactly this – so our ’dashboard’ of machines will list our local
      machines as well as our cloud machines. Your IT guys have visibility into what’s going
      on.
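
As a rough idea of the Data Sync point above – once the day’s rows have been synced down, the local side is just a normal database you can query. A minimal T-SQL sketch, assuming a purely hypothetical Orders table (the table and column names are made up for illustration):

-- Hypothetical check of the day's orders in the locally synced database
-- (dbo.Orders and its columns are illustrative only)
SELECT OrderId, CustomerName, OrderTotal, CreatedUtc
FROM dbo.Orders
WHERE CreatedUtc >= CAST(GETDATE() AS DATE)   -- everything synced down for today
ORDER BY CreatedUtc;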

We’ve been using Singapore DCs or West Coast US with pretty good performance times
across the infrastructure. 

What does having a local Windows Azure Data Center mean to me?

  • Medical Industry – we have several medical clients allowing us to
    innovate around Cloud technologies using HL7 transports. Faster time to market and
    higher degrees of re-use.

  • Cloud Lab Manager – www.cloudlabmanager.com can
    run locally for all training providers. Breeze has created an award winning cloud
    based application that will certainly benefit from this piece of great news.

  • Creating a cloud-based application is now feasible (this particular
    one had been held back due to the sensitive nature of the information it carries).

  • And lastly, I can house my MineCraft server – well, it’s my 10-year-old
    son’s, and half the school’s, I reckon.

 

So, for you:

Ask yourself the question – are you getting all these features from where you currently
host/run your hardware?

Lack of infrastructure and provisioning challenges shouldn’t be holding back new ideas
& business movement. iPads, smartphones, anywhere, any-time access should be the
norm – it’s not like we’re putting another person on the moon.

It’s all about using the right tool for the job.

Enjoy, folks – it’s certainly exciting times ahead for us Aussies!!

Microsoft’s Announcement

Blog Post by: Mick Badran

BizTalk 2013 Installation and Configuration – Configure SQL Server Network Configuration protocols (Part 10)

Under certain stress conditions (such as clients accessing SQL Server from the same computer), the SQL Server Shared Memory protocol may lower BizTalk Server performance. You can resolve this problem by disabling the use of the Shared Memory network protocol in SQL Server Network Configuration. Also, to facilitate transactions between SQL Server and BizTalk Server, […]
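
As a quick way of seeing whether local BizTalk connections are actually coming in over Shared Memory (before and after making the change in SQL Server Network Configuration), a minimal T-SQL sketch – the DMVs below ship with SQL Server; the interpretation is mine, not from the original post:

-- List the network transport each current connection is using
-- ('Shared memory', 'TCP', 'Named pipe')
SELECT c.session_id,
       c.net_transport,
       s.program_name
FROM sys.dm_exec_connections AS c
JOIN sys.dm_exec_sessions AS s
    ON s.session_id = c.session_id
ORDER BY c.net_transport;
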
Blog Post by: Sandro Pereira

ESB Toolkit 2.2 SSO Configuration Error

Hello There.

If you have tried using SSO-based configuration for your ESB Config in the new ESB Toolkit 2.2, you’ve probably encountered this error:

[screenshot: ESB configuration SSO error]

This appears to be a bug, for which I believe a hotfix request has been submitted.  Please contact Microsoft Support if you are experiencing this issue, as it will help get the fix in place faster.

If you need a workaround for a NON-PRODUCTION environment, you can copy the old Microsoft.BizTalk.Interop.SSOClient.dll from the BizTalk 2013 Beta install media, and paste it into the Bin folder of the toolkit, so it is used by the ESB Configuration Tool.  Here are some details of the differences:

From BizTalk 2013 RTM:
[screenshot: Microsoft.BizTalk.Interop.SSOClient.dll assembly details]

From BizTalk 2013 Beta:
[screenshot: Microsoft.BizTalk.Interop.SSOClient.dll assembly details]

Once the older version of the dll is copied over, the Configuration tool sets up SSO-based configuration correctly.
[screenshot: SSO-based configuration completing successfully]

So keep your eyes open for a hotfix!

Cheers,
Dan

BizTalk 2013 Installation and Configuration – Install and Configure BizTalk Server 2013 (Part 9)

In this section you’ll install BizTalk Server, confirm that the installation succeeded, and then configure BizTalk Server. When you installed SQL Server, setup granted your account Database Administrator rights. Since these rights are also required for installing BizTalk Server, you must do one of the following: Use the same account you used when you installed […]
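
If you choose to grant the rights to a separate install account rather than reusing the account that installed SQL Server, a minimal T-SQL sketch – the login name is hypothetical (and must already exist), and the ALTER SERVER ROLE syntax requires SQL Server 2012:

-- Check whether the current login already has sysadmin rights (1 = yes)
SELECT IS_SRVROLEMEMBER('sysadmin') AS IsSysAdmin;

-- Grant sysadmin to the account that will run the BizTalk installation
-- (CONTOSO\BizTalkInstall is a hypothetical, already-created login)
ALTER SERVER ROLE [sysadmin] ADD MEMBER [CONTOSO\BizTalkInstall];
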
Blog Post by: Sandro Pereira

ESB Toolkit 2.2 Itinerary Selection Error

Hello there!

For any of you who may be using BizTalk 2013 RTM and the ESB Toolkit 2.2, there is an error you will encounter when selecting an itinerary using either the BRI or ITINERARY-STATIC resolvers in your receive pipeline.  The following article describes the problem and the workaround.  It boils down to the esb.config file not being updated for the new Unity container, so the XML structure is now incorrect.  This article provides a nice download of the modified configuration file, and I’ve verified it works!

http://www.ithero.nl/post/2013/05/12/How-to-fix-the-error-Exception-has-been-thrown-by-the-target-of-an-invocation-when-using-the-BRI-resolver-in-the-ESB-Toolkit-and-BizTalk-2013.aspx

Cheers,
Dan

Duplicate key row in object ‘dbo.bts_LogShippingJobs’

A customer was setting up Log Shipping disaster recovery for their BizTalk Server 2006 R2 environment. http://technet.microsoft.com/en-us/library/aa560961(v=bts.20).aspx  They had gotten to the step where they run the stored procedure bts_ConfigureBizTalkLogShipping. They received the error “Msg 2601, Level 14, State 1,  Procedure bts_ImportSQLAgentJobs, Line 56 Cannot insert duplicate key row in object ‘dbo.bts_LogShippingJobs’ with unique index ‘CIX_LogShippingJobs’.”

It turns out that this customer also had several of their own SQL Agent jobs running on the BizTalk server. As part of configuring the destination environment, we attempt to recover all of the jobs running on the BizTalk server with one exception: we don’t support importing of SQL Agent jobs where the steps use more than one database. The script’s logic iterates over each database to be recovered and logs the jobs associated with that database in bts_LogShippingJobs for later recovery. If a job has more than one database association, the script attempts to log it twice. But we never want to recover the same job more than once, so bts_LogShippingJobs doesn’t allow duplicate jobs. When the script attempts to log the job the second time, it fails.

In general, we discourage running any jobs on the SQL Server that supports BizTalk Server other than the jobs that ship with the product.

For those customers who choose to run their own jobs and encounter this issue, the solution is to temporarily remove any job(s) associated with multiple databases while setting up the destination recovery environment. You will also need to develop your own recovery plan for any such job(s). Once the destination environment is configured, you can restore the jobs. The following T-SQL will identify jobs associated with multiple databases:

-- Count the distinct databases referenced by each SQL Agent job's steps
SELECT j.name, COUNT(DISTINCT js.database_name) AS dbcount
INTO #tmp FROM msdb.dbo.sysjobsteps js
JOIN msdb.dbo.sysjobs j WITH (NOLOCK)
ON j.job_id = js.job_id
GROUP BY j.name

-- Jobs associated with more than one database will trip up bts_ConfigureBizTalkLogShipping
SELECT * FROM #tmp WHERE dbcount > 1

-- Clean up the temporary table
DROP TABLE #tmp
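
If you do need to pull one of the flagged jobs out temporarily, script it out in Management Studio first so it can be recreated later, then drop it along these lines – the job name below is purely hypothetical:

-- Temporarily remove a multi-database job before running bts_ConfigureBizTalkLogShipping
-- (recreate it from the saved script once the destination environment is configured)
EXEC msdb.dbo.sp_delete_job @job_name = N'Contoso nightly maintenance';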

BizTalk 2013 Installation and Configuration – Configure SQL Server Database Mail feature (Part 8)

If your BizTalk Server environment uses SQL Server 2012 and you wish to configure BAM Alerts, you must have already configured SQL Server Database Mail feature before you try to configure BAM Alerts, otherwise the BizTalk Basic configuration will ignore this feature (BAM Alerts). Database Mail is an enterprise solution for sending e-mail messages from […]
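
For reference, Database Mail can also be set up in T-SQL rather than through the wizard. A minimal sketch – the account, profile, addresses and SMTP server are all hypothetical:

-- Enable the Database Mail feature
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'Database Mail XPs', 1;
RECONFIGURE;

-- Create a mail account and profile for BAM Alerts to use (names are hypothetical)
EXEC msdb.dbo.sysmail_add_account_sp
    @account_name    = N'BizTalkMail',
    @email_address   = N'biztalk@contoso.com',
    @mailserver_name = N'smtp.contoso.com';

EXEC msdb.dbo.sysmail_add_profile_sp
    @profile_name = N'BizTalk Alerts';

EXEC msdb.dbo.sysmail_add_profileaccount_sp
    @profile_name    = N'BizTalk Alerts',
    @account_name    = N'BizTalkMail',
    @sequence_number = 1;

-- Quick smoke test
EXEC msdb.dbo.sp_send_dbmail
    @profile_name = N'BizTalk Alerts',
    @recipients   = N'admin@contoso.com',
    @subject      = N'Database Mail test',
    @body         = N'Database Mail is configured.';
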
Blog Post by: Sandro Pereira

Workflow Manager Installation Error: Adding host to Service Bus Farm

I was trying to get the new workflow engine installed and configured with SharePoint 2013 but was encountering an error in the Service Bus configuration step.
Installation Instructions for Workflow Manager and SharePoint 2013 Workflow Configuration:
The full installation steps are provided here (in a video series): http://technet.microsoft.com/en-us/library/dn201724.aspx and here: http://technet.microsoft.com/en-us/library/jj193478
The instructions provided above are pretty detailed […]
Blog Post by: kadasani

BizTalk 2013 Installation and Configuration – Install SQL Server 2012 (Part 7)

BizTalk Server provides the capability to specify a business process and also a mechanism by which the applications used in that business process can communicate with each other. SQL Server is the main repository for this communication mechanism. For optimal performance, Microsoft recommends using the Enterprise Edition of SQL Server. Note: Using SQL Server Express […]
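
Not from the original post, but a quick sanity check after the install to confirm which edition and build ended up on the box:

-- Confirm the SQL Server edition and version BizTalk will run against
SELECT SERVERPROPERTY('Edition')        AS Edition,
       SERVERPROPERTY('ProductVersion') AS ProductVersion,
       SERVERPROPERTY('ProductLevel')   AS ProductLevel;
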
Blog Post by: Sandro Pereira