Slides and Lab from Global Azure Integration Bootcamp on Logic Apps and Functions

I had a great time presenting at the Microsoft Technology Center in Manhattan a few weekends ago, covering Azure Functions and Azure Logic Apps as part of the Global Integration Bootcamp!

Global Integration Bootcamp

We had a great group of presenters and an even better group of participants.

If you are interested in the slides and lab for the session, you can download them below.  The topics covered are Azure Logic Apps, Azure Functions, and Azure Storage.

Download Slides: Logic App Cloud Adapters, Functions, and Storage

Download Lab: Lab – Logic App Cloud Adapters, Functions, and Storage

Notes for the lab: You need a hosted email account (Gmail, Outlook, Office 365, etc.) and a trial Twilio account (this can be skipped if you don’t want to receive a text).

Azure Logic Apps now has support for Variables

Windows Azure Logic Apps now have support for variables inside a logic app!

In order to use them, just search for Variables inside the Add Action dialog box.

You have two options:  one to initialize a variable and one to increment a variable.

Currently, Variables support Integer and Float variable types as shown in the image below.  But as with all things Logic Apps, this could change later on.

Logic Apps Variables
Logic Apps Variable Options - Integer and Float
Logic Apps Set Variable
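If you switch to code view, the Initialize variable action ends up as JSON roughly like the following.  This is a sketch; the variable name and starting value here are just examples:

```json
"Initialize_variable": {
    "type": "InitializeVariable",
    "inputs": {
        "variables": [
            {
                "name": "TestingVariables",
                "type": "Integer",
                "value": 0
            }
        ]
    },
    "runAfter": {}
}
```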

You can also use Math Functions when assigning and incrementing variables.

An example of this is using the built-in add function to add 5 to an existing variable.  It looks like this:

@add(variables('TestingVariables'),5)

You access a variable inside JSON like this:

@variables('<Variable Name>')
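Putting the two together, an Increment variable action that uses add in code view looks roughly like this.  Counter and TestingVariables are illustrative names; they must be two different variables:

```json
"Increment_variable": {
    "type": "IncrementVariable",
    "inputs": {
        "name": "Counter",
        "value": "@add(variables('TestingVariables'), 5)"
    },
    "runAfter": {}
}
```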

One interesting point to note is that you cannot use the output of a variable as the increment for the same variable.

You will get this error when you try to save the Logic App:

Failed to save logic app TestVariables. The inputs of workflow run action ‘Increment_variable’ of type ‘IncrementVariable’ are not valid. Self reference is not supported when updating the value of variable ‘Increment_variable’.
What's next for Logic App variables

What would you like to see next for variables in Azure Logic Apps?

  • More data types?
  • Cross Logic App variable support?
  • Ability to create more than one variable at a time?
  • More options than just increment and create?

Deploying an Azure Logic App from Visual Studio between multiple Regions

Are you working with Windows Azure Logic Apps inside Visual Studio and have you seen an error like this after you deploy?

The API connection ‘/subscriptions/{Subscription ID}/resourceGroups/{Resource Group Name}/providers/Microsoft.Web/connections/sql’ is a connection under a managed API. Connections under managed APIs can be used if and only if the API host is a managed API host. Either omit the ‘host.api’ input property, or verify that the specified runtime URL matches the regional managed API host endpoint ‘https://logic-apis-westus.azure-apim.net/’.

What I have found is that a Logic App gets a little sticky to a region: it holds on to the initial region you set when you first created the Logic App.  Most of the shapes inside a Logic App are internal API calls to Microsoft-hosted services.  This ends up looking like this in the JSON:

"host": {
    "api": {
        "runtimeUrl": "https://logic-apis-eastus.azure-apim.net/apim/sql"
    },
    "connection": {
        "name": "@parameters('$connections')['sql']['connectionId']"
    }
}

As you can see, eastus is set in the runtimeUrl of the internal API call to the SQL API.  At present, Visual Studio does not replace this value with the correct region when you deploy.

So what happens when you deploy to another region?  Well, these values get sent as-is.

If you run the Logic App, you will get an error message like the one seen above.

The fix is simple: once you deploy the Logic App into a new region, open it inside the web portal and save it.  You do not have to do anything else.  This will adjust the runtimeUrl values to the correct region.
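If you deploy the Logic App with an Azure Resource Manager template, another option is to build the region into the runtimeUrl with a template expression, so the value is resolved at deployment time.  A sketch, assuming the SQL connection from the example above:

```json
"host": {
    "api": {
        "runtimeUrl": "[concat('https://logic-apis-', resourceGroup().location, '.azure-apim.net/apim/sql')]"
    },
    "connection": {
        "name": "@parameters('$connections')['sql']['connectionId']"
    }
}
```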

Happy Logic Apping!

Fixing the Unable to process template language expressions in action HTTP Unexpected token StartObject Error in Azure Logic Apps

I have been working a lot with Azure Logic Apps over the past month.  Since I am new to Logic Apps, I often run into silly issues that turn out to be trivial to fix.  This is one of them. 

I was working with the HTTP REST API shape to call a custom API – actually trying to call the Azure REST API to do an action on another Logic App, but more on that later.

I was setting the Content-Type and Authorization inside the Headers section of the shape.

I kept receiving this error:

Unable to process template language expressions in action ‘HTTP’ inputs at line ‘1’ and column ‘1234’: ‘Error reading string. Unexpected token: StartObject. Path ‘headers.authentication’.’.

The fix was super simple.  I had not expanded the Show Advanced Options section for this shape.  Once expanded, I saw that Authorization is broken out from the other headers.  I moved the Authorization values from the Headers section to there and it worked as expected!

So, note to self: if something does not work as expected, try expanding the Advanced Options section of the shape to see if that helps. 
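For reference, in code view the Authorization ends up in a separate authentication input on the HTTP action rather than inside headers.  A sketch of the shape; the URI and credential values are placeholders, and I am assuming Azure AD authentication against the Azure REST API:

```json
"HTTP": {
    "type": "Http",
    "inputs": {
        "method": "POST",
        "uri": "https://management.azure.com/<your resource path>",
        "headers": {
            "Content-Type": "application/json"
        },
        "authentication": {
            "type": "ActiveDirectoryOAuth",
            "tenant": "<tenant id>",
            "audience": "https://management.azure.com/",
            "clientId": "<client id>",
            "secret": "<client secret>"
        }
    },
    "runAfter": {}
}
```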

Assigning an Integration Account to an Azure Logic App inside Visual Studio

I have been working heads down for a few weeks now with Windows Azure Logic Apps.  While I have worked with them off and on for over a year now, it is amazing how far things have evolved in such a short amount of time.  You can put together a rather complex EDI scenario in just a few hours with no up-front hardware or licensing costs. 

I have been creating Logic Apps both using the web designer and using Visual Studio 2015. 

Recently I was trying to use the Transform shape that is part of Azure Integration Accounts (still in technical preview).  I was able to set all the properties and manually enter a map name.  Then I ran into issues. 

I found that if I switched to code view, I was not able to get back to the designer without manually removing the Transform shape.  I kept getting the following error:  The ‘inputs’ of workflow run action ‘Transform_XML’ of type ‘Xslt’ is not valid. The workflow must be associated with an integration account to use the property ‘integrationAccount’.

What I was missing was setting the Integration Account for this Logic App.  Using the web interface, it’s very easy to set.  But I looked all over the JSON file and Visual Studio for how to set the Integration Account for a Logic App inside Visual Studio.

With the help of Jon Fancey, it turns out it is super simple.  It is just like an orchestration property.

To set the Integration Account for a Logic App inside Visual Studio, do the following:

1. Ensure you have an Integration Account already created inside the subscription and Azure location.

2. Make sure you set the Integration Account BEFORE trying to use any shapes that depend on it, like the Transform shape.

3. Click anywhere in the white space of the Visual Studio Logic App designer.

4. Look inside the Properties window for the Integration Account selection.

5. Select the Integration Account you want to use and save your Logic App.
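If you prefer to make the change by hand, the Properties window simply writes the Integration Account into the workflow resource of the deployment template.  Roughly like this; the parameter names are illustrative and the workflow definition is omitted:

```json
{
    "type": "Microsoft.Logic/workflows",
    "apiVersion": "2016-06-01",
    "name": "[parameters('logicAppName')]",
    "location": "[resourceGroup().location]",
    "properties": {
        "integrationAccount": {
            "id": "[resourceId('Microsoft.Logic/integrationAccounts', parameters('integrationAccountName'))]"
        },
        "definition": {}
    }
}
```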

It’s that simple! 

Enjoy.

Want Microsoft Azure resources with a view? Azure prices vary by Datacenter even in the United States

In May, I gave a session at Integrate 2016 titled Azure IaaS Essentials for the BizTalk Developer (watch online now).

In that session I outlined that prices for Azure resources vary by data center. 

In case you did not know, the price you pay for Azure resources in a US data center can differ from that in Brazil (more expensive), Japan, India, and so on. 

What is interesting, though, is that even data centers within the United States have different prices. 

From what I’ve seen, the East 2 and West 2 data centers seem to have better prices than a lot of the other US data centers.

I checked some prices on Virtual Machines and Storage – not all Azure resources – but some prices were as much as 13% lower! 

If you use a lot of Azure, savings of up to 13% or more can really add up.

The key takeaway is to check the prices of your resources in multiple data centers if your scenario allows it.

INTEGRATE 2016: My session on Azure IaaS and Azure Training

I am excited to be presenting at the 2016 INTEGRATE conference in London.  Not only will I be able to get great pizza at Pizza Express, I will hopefully get to fill in everyone on the latest offerings in Azure IaaS.

Registration is still open for the conference.  Rates are £450 per person for the 3-day event.  It is a super deal compared to other conferences.  You can get more details on registration here.

My session title is “Azure IaaS Essentials for the BizTalk Developer”.

The abstract is below:
Azure Infrastructure as a Service consists of Virtual Networking and Virtual Machines.  In this session Stephen will cover the essentials every developer should know about IaaS, including on-premises connectivity options, how to use virtual networks with virtual machines, sizing options of virtual machines, and management options.  Stephen will show you how to use PowerShell to take full control of Azure Virtual Machines and make Infrastructure almost as fun as Development!  In addition, see how simple it is to build a fully isolated BizTalk domain in Azure with just a few clicks.

If you are new to Azure or have been out of the loop for even a few months, Michael Stephenson is putting on four “Zero-to-Cloud” sessions.  Two are before the conference and two after.  Each session is limited to 10 people, and they are held at the BizTalk360 office just outside London (an easy 30-minute train ride from central London).  While I have not attended one of his classes myself, I am sure it will not disappoint!  Get more details on this event here.

Hope to see you in London in just a few weeks!

Pluralsight’s BizTalk Learning Path

I am excited to be part of creating the official Pluralsight BizTalk Learning Path.

This consists of just over 34 hours of hard-core BizTalk training brought to you by Matt Milner, Mohamad Halabi, Dan Toomey, and me. 

Learning path course sequence:

BizTalk 2006 Fundamentals
Advanced BizTalk Server 2010
Intro to BizTalk Server 2013 Enterprise Service Bus (ESB) Toolkit
BizTalk Server 2013 from Ground Up: An End to End Scenario
What’s New in BizTalk Server 2013
Using Functoids in the BizTalk 2013 Mapper

Total – 34h 15m

So if you find yourself not being able to sleep at night, or on 3 long 12-hour flights around the world, it is well worth the 34 hours to master your BizTalk skills.

Remember: you can always watch Pluralsight courses at up to 2x speed.  That would cut the time down to just over 17 hours!

I still have a few no credit card required, 30-day unlimited free trial Pluralsight memberships available!   If interested, just fill out the form here http://www.biztalkguru.com/30-day-pluralsight-free-trial-request/

Build a Multi-Server, Configured, BizTalk 2013 R2 Domain in Windows Azure IaaS in less than 1 hour!

A little over 2 years ago I released a set of PowerShell scripts for auto-creating a full multi-server BizTalk 2013 domain inside Azure IaaS.  I spent hundreds of hours on these scripts to get them working correctly.  You can view my session at TechEd 2013 US online for more details.

I have made updates to these scripts to support BizTalk 2013 R2 and SQL Server 2014.  These scripts automatically create a clean, isolated, two-server BizTalk Server 2013 R2 domain using SQL Server 2014.  All domain user accounts are auto-created.  BizTalk group and host settings are created.  The only manual intervention needed is to log into each server and run a PowerShell script.  In less than an hour you can have a fully configured BizTalk 2013 R2 domain!  And since it is scripted, you can create a clean environment from scratch anytime you need one.

Note: Running these scripts will create Virtual Machines inside Windows Azure.  You will be charged for the Virtual Machines.  Make sure you understand the costs before running these scripts.

Why not use an Azure Resource Manager template for this?  Great question!  I tried.  I spent about 40 hours trying to port these scripts into an Azure Resource Manager template with no success.  I was unable to get the BizTalk and SQL servers to join the domain.  Azure Resource Manager PowerShell has a new set of commands that I could have used, but they are in the process of changing – so for now, keeping the scripts as-is seemed the best course of action.

You can review and download the updated scripts on MSDN – Automatic Multi-Server BizTalk 2013 R2 Domain Creation PowerShell Scripts for Azure IaaS.  If you like the scripts, please rate them 5 stars.

To run the scripts inside Visual Studio 2013 / 2015 do the following:

  1. Download them from MSDN – they are inside a Visual Studio 2013 Solution
  2. Open Visual Studio as an Administrator and open the Solution
  3. Update 3 values inside variables.ps1.  They are the top 3 variables in the file.
  4. Update the top variable inside MAIN_MASTER.ps1.  It is the first variable in the script.
  5. Right-click on MAIN_MASTER.ps1 and select "Execute As Script".  Wait about 45 minutes. 
  6. Log into BizTalk 02, open C:\BizTalk_Provisioning\ReadMe.docx.  Follow the instructions.  Do not close the Remote Desktop session.  The PowerShell window will open and close right away.  See details below.
  7. Log into BizTalk 01, open C:\BizTalk_Provisioning\ReadMe.docx.  Follow the instructions.  Wait about 10 minutes. Note that you have to run the last command twice.  See details below.

To run the scripts through the PowerShell ISE, do the following:

  1. Download them from MSDN – they are inside a Visual Studio 2013 Solution
  2. Open Main_Master.ps1 and variables.ps1 inside the PowerShell ISE; make sure to open the PowerShell ISE as an Administrator
  3. Update 3 values inside variables.ps1
  4. Update the top variable inside MAIN_MASTER.ps1
  5. Run MAIN_Master.ps1 as an Administrator
  6. Once complete, Log into BizTalk 02, open C:\BizTalk_Provisioning\ReadMe.docx.  Follow the instructions.  Do not close the Remote Desktop session.  The PowerShell window will open and close right away.   See details below.
  7. Log into BizTalk 01, open C:\BizTalk_Provisioning\ReadMe.docx.  Follow the instructions.  Wait about 10 minutes. Note that you have to run the last command twice.  See details below.

This download contains the following scripts:

  • MAIN_Master.ps1 – This is the main script that calls all the other scripts that do the work.
  • MAIN_RemoveVM.ps1 – This script removes and deletes everything created inside the domain.
  • MAIN_SaveMoney_StopAll.ps1 – This script turns off all the VMs in the domain.  You need to update 1 value in this file to run it.
  • MAIN_SpendMoney_StartAll.ps1 – This script turns on all the VMs in the domain.  You need to update 1 value in this file to run it.
  • variables.ps1 – Core variable file, you need to update 3 values in here.
  • Various other helper scripts and files

Detailed Post Script Configuration Steps (this is outlined in the ReadMe.docx downloaded to each server)

On BizTalk 02:

  1. Once the PowerShell scripts are complete, log into the BizTalk 02 server with the domain admin account.
  2. Open a PowerShell window as Administrator.
  3. Change directories: cd c:\BizTalk_Provisioning
  4. Run the following command: .\MicrosoftCloudProvisioningLocalService.exe – you should see this running as a service.
  5. Leave the Remote Desktop session open and go to the BizTalk 01 server.

On BizTalk 01:

  1. Once the PowerShell scripts are complete and the service is started on BizTalk 02, log into the BizTalk 01 server with the domain admin account.
  2. Open two PowerShell windows as Administrator.
  3. In the first window, change directories: cd c:\BizTalk_Provisioning
  4. Run the following command: .\MicrosoftCloudProvisioningLocalService.exe – you should see this running as a service.
  5. In the second window, change directories: cd c:\BizTalk_Provisioning
  6. Run the following command: .\Microsoft.Cloud.Provisioning.Client.exe .\multinodeconfigDemo.xml – you should see several pop-up windows.  This should fail.
  7. Run the same command again – you should see several pop-up windows again, but this time it should complete.
  8. The end result is a fully configured, multi-server BizTalk domain inside Windows Azure!

Make sure you use the provided scripts to start, stop, and remove the artifacts created.  I look forward to your feedback!

BizTalk and SQL Azure Publisher Name, Offer, and SKUs for Azure Resource Manager PowerShell Module

With the introduction of Azure Resource Manager (ARM) the world of Infrastructure as a Service (IaaS) inside Azure changed – for the better.

ARM introduced a new way to organize and deploy resources across various aspects of Windows Azure.  Now you can create configuration scripts for complex Azure scenarios that include storage, web apps, virtual machines, networks, SQL databases, and more.  

With these changes came the AzureResourceManager PowerShell module.  I could write many posts on this alone, but without getting into too many details: you now have two ways to do many things in Windows Azure using PowerShell.  This gets a little more complex considering additional breaking changes are coming soon when Switch-AzureMode is deprecated, along with the introduction of both Resource Manager and classic versions of some Azure artifacts like storage and virtual machines.  Going forward, I recommend using AzureResourceManager-based PowerShell commands for all new scripts.

Nevertheless, the way to create a new Virtual Machine is a little more complex now using AzureResourceManager.  This is due to the greatly enhanced feature set in Azure that we just did not have a few years ago.  As part of these changes, you need to call out your base image disk differently than before.  Let's take a look at the two options.

Using Azure Service Manager (ASM – default module in PowerShell right now)

$imageWindows = 'bd507d3a70934695bc2128e3e5a255ba__RightImage-Windows-2012-x64-v13.5'
$MyDC = New-AzureVMConfig -Name $vmnamePDC -InstanceSize $sizePDC -ImageName $imageWindows

Using Azure Resource Manager (ARM- will be the new default)

$VirtualMachine = Set-AzureVMSourceImage -VM $VirtualMachine -PublisherName "MicrosoftBizTalkServer" -Offer "BizTalk-Server" -Skus "2013-R2-Developer" -Version "latest"

With Azure Service Manager, you only need to supply the operating system image name, and this can easily be found with a single PowerShell query. 
With Azure Resource Manager, you need to specify a Publisher Name, Offer, SKU, and Version of the operating system Azure image you want to use.  It is not as easy or straightforward to get these values.

Below I have put together a quick reference list related to BizTalk Server, SQL Server, and Windows as well as the PowerShell scripts to get these values for any other operating system image.

Publisher              | Offer                    | SKU
---------------------- | ------------------------ | ----------------------------------------
MicrosoftBizTalkServer | BizTalk-Server           | 2013-Developer
MicrosoftBizTalkServer | BizTalk-Server           | 2013-Standard
MicrosoftBizTalkServer | BizTalk-Server           | 2013-Enterprise
MicrosoftBizTalkServer | BizTalk-Server           | 2013-R2-Developer
MicrosoftBizTalkServer | BizTalk-Server           | 2013-R2-Standard
MicrosoftBizTalkServer | BizTalk-Server           | 2013-R2-Enterprise
MicrosoftSQLServer     | SQL2008R2SP3-WS2008R2SP1 | Enterprise, Standard, Web
MicrosoftSQLServer     | SQL2012SP2-WS2012        | Enterprise, Standard, Web *
MicrosoftSQLServer     | SQL2012SP2-WS2012R2      | Enterprise, Standard, Web *
MicrosoftSQLServer     | SQL2014-WS2012R2         | Enterprise, Standard, Web *
MicrosoftSQLServer     | SQL2014SP1-WS2012R2      | Enterprise, Standard, Web *
MicrosoftSQLServer     | SQL2016CTP2-WS2012R2     | Enterprise
MicrosoftWindowsServer | WindowsServer            | 2008-R2-SP1
MicrosoftWindowsServer | WindowsServer            | 2012-Datacenter
MicrosoftWindowsServer | WindowsServer            | 2012-R2-Datacenter
MicrosoftWindowsServer | WindowsServer            | 2016-Technical-Preview-3-with-Containers
MicrosoftWindowsServer | WindowsServer            | Windows-Server-Technical-Preview

* Some of the SQL SKUs contain other types of optimized versions.

If you want to get the Publisher Name, Offer, and SKU for other Azure Images you can use these PowerShell commands.

# Gets all Publishers in a specific datacenter
$locName = "West US"
Get-AzureVMImagePublisher -Location $locName | Select-Object PublisherName | Out-GridView -PassThru

# Gets all Offers for a given Publisher in one location
$locName = "West US"
$pubName = "MicrosoftWindowsServer"
Get-AzureVMImageOffer -Location $locName -Publisher $pubName | Select-Object Offer | Out-GridView -PassThru

# Gets all SKUs for a given Offer from a given Publisher in one location
$locName = "West US"
$pubName = "MicrosoftWindowsServer"
$offerName = "WindowsServer"
Get-AzureVMImageSku -Location $locName -Publisher $pubName -Offer $offerName | Select-Object Skus

Enjoy.