Slides and Lab from Global Azure Integration Bootcamp on Logic Apps and Functions

I had a great time presenting at the Microsoft Technology Center a few weekends ago in Manhattan, covering Azure Functions and Azure Logic Apps as part of the Global Integration Bootcamp!

Global Integration Bootcamp

We had a great group of presenters and an even better group of participants.

If you are interested in the slides and lab for the session, you can download them below.  The topics covered are Azure Logic Apps, Azure Functions, and Azure Storage.

Download Slides: Logic App Cloud Adapters, Functions, and Storage

Download Lab: Lab – Logic App Cloud Adapters, Functions, and Storage

Notes for the lab: You need a hosted email account (Gmail, Outlook, Office 365, etc.) and a trial Twilio account (this can be skipped if you don’t want to receive a text).

Azure Logic Apps now has support for Variables

Azure Logic Apps now has support for variables inside a logic app!

 

In order to use them, just search for Variables inside the Add Action dialog box.

You have two options:  one to initialize a variable and one to increment a variable.

Currently, Variables support Integer and Float variable types as shown in the image below.  But as with all things Logic Apps, this could change later on.

 

Logic Apps Variables
Logic Apps Variable Options - Integer and Float
Logic Apps Set Variable

You can also use Math Functions when assigning and incrementing variables.

An example is using the built-in add function to add 5 to an existing variable:

@add(variables('TestingVariables'),5)

You access a variable inside JSON like this:

@variables('<Variable Name>')
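If you open the code view, these actions show up as InitializeVariable and IncrementVariable action types. Here is a rough sketch of what the underlying JSON looks like (the variable name and values are illustrative):

```json
"Initialize_variable": {
    "type": "InitializeVariable",
    "inputs": {
        "variables": [
            {
                "name": "TestingVariables",
                "type": "Integer",
                "value": 0
            }
        ]
    },
    "runAfter": {}
},
"Increment_variable": {
    "type": "IncrementVariable",
    "inputs": {
        "name": "TestingVariables",
        "value": 5
    },
    "runAfter": {
        "Initialize_variable": [ "Succeeded" ]
    }
}
```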

One interesting point to note is that you cannot use a variable as the increment value for that same variable.

You will get this error when you try to save the Logic App:

Failed to save logic app TestVariables. The inputs of workflow run action 'Increment_variable' of type 'IncrementVariable' are not valid. Self reference is not supported when updating the value of variable 'Increment_variable'.
What’s next for Logic App variables

What would you like to see next for variables in Azure Logic Apps?

  • More data types?
  • Cross Logic App variable support?
  • Ability to create more than one variable at a time?
  • More options than just increment and create?
#MiddlewareFriday

The purpose of this post is to talk about a side project that I have going on with Saravana Kumar and BizTalk360. The purpose of #MiddlewareFriday is to create a video blog of new and interesting developments going on in the industry.  Each week we will publish a short video with new content.  The content may feature news and demos, and will also highlight other activities going on in the community.  From time to time we will also bring on some guests to keep the content fresh and get some different perspectives.

For both Saravana and myself there is no direct commercial incentive in doing the show.  It really comes down to participating in a community, learning by doing, improving communication skills and having some fun along the way.

I am going to keep this post updated to keep a running list of the shows – in part to aid in search engine discovery.

Episode | Title | Date | Tags
1 | Protecting Azure Logic Apps with Azure API Management | January 6, 2017 | Azure API Management, Logic Apps, ServiceNow, API Apps
2 | Azure Logic Apps and Service Bus Peek-Lock | January 13, 2017 | Logic Apps, Service Bus, Patterns
3 | Logic Apps and Cognitive Services Face API – Part 1 | January 20, 2017 | Logic Apps, Cognitive Services, Face API, Steef-Jan Wiggers
4 | Microsoft PowerApps and Cognitive Services Face API – Part 2 | January 27, 2017 | PowerApps, Cognitive Services, Face API
5 | Serverless Integration with Steef-Jan Wiggers | February 3, 2017 | Logic Apps, Sentiment Analysis, Slack, Azure Functions, Steef-Jan Wiggers
6 | Azure Logic Apps and Power BI Real Time Data Sets | February 10, 2017 | Logic Apps, Power BI connector, Sandro’s Integration stencils, QuickLearn, Global Integration Bootcamp
7 | Azure Monitoring, Azure Logic Apps and Creating ServiceNow Incidents | February 17, 2017 | Logic Apps, Azure Monitoring, API Apps, ServiceNow, Glen Colpaert, SAP, Webhook Notification BizTalk360
8 | Exploring ServiceBus360 Monitoring | February 24, 2017 | Service Bus, BizTalk360, Community Content: Team Flow + Luis, Exception handling for Logic App Web Services, Toon Vanhoutte
9 | Coming Soon – SAP and Logic Apps | March 3, 2017 | TBD

Deploying an Azure Logic App from Visual Studio between multiple Regions

Are you working with Azure Logic Apps inside Visual Studio and seeing an error like this after you deploy?

The API connection '/subscriptions/{Subscription ID}/resourceGroups/{Resource Group Name}/providers/Microsoft.Web/connections/sql' is a connection under a managed API. Connections under managed APIs can be used if and only if the API host is a managed API host. Either omit the 'host.api' input property, or verify that the specified runtime URL matches the regional managed API host endpoint 'https://logic-apis-westus.azure-apim.net/'.

What I have found is the Logic App gets a little sticky to a Region.  It seems to like the initial region you set when you first created the Logic App.  Most of the shapes inside a Logic App are internal API calls to Microsoft hosted services.  This ends up looking like this in the JSON:

"host": {
    "api": {
        "runtimeUrl": "https://logic-apis-eastus.azure-apim.net/apim/sql"
    },
    "connection": {
        "name": "@parameters('$connections')['sql']['connectionId']"
    }
}

As you can see, eastus is set in the runtimeUrl of the internal API call to the SQL API.  When this is deployed to another region, Visual Studio does not currently replace this value with the correct region.

So what happens when you deploy to another region?  Well these values get sent as-is. 

If you run the Logic App you will get an error message like seen above. 

The fix is simple: once you deploy the Logic App into a new region, open it inside the Web Portal and save it.  You do not have to do anything else.  This will adjust the runtimeUrl values to the correct region.
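Note that the error message itself offers an alternative: omit the host.api input property altogether. If you strip the api block from the template so only the connection reference remains, the runtime resolves the regional endpoint on its own. A sketch of what the trimmed host section would look like:

```json
"host": {
    "connection": {
        "name": "@parameters('$connections')['sql']['connectionId']"
    }
}
```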

Happy Logic Apping!

2016 Year in Review, Looking Ahead to 2017

It is that time of year where I like to reflect back on what the previous year has brought and also set my bearings for the road ahead.  If you are interested in reading my 2015 recap, you can find it here.

Personal

2016 was a milestone birthday for my twin brother and me. In order to celebrate, and try to deny getting old for another year, we decided to run a marathon in New York City.  The NYC Marathon is one of the 6 major marathons in the world, so it acted as a fantastic backdrop for our celebration.  Never one to turn down an adventure, my good friend Steef-Jan Wiggers joined us for this event.  As you may recall, Steef-Jan and I ran the Berlin Marathon (another major) back in 2013.

The course was pretty tough.  The long arching bridges created some challenges for me, but I fought through it and completed the race.  We all finished within about 10 minutes of each other and had a great experience touring the city.

 

Kurt, Kent and Steef-Jan in the TCS tent before the race

At the finish line with the hardware.

Celebrating our victory at Tavern on the Green in Central Park.

Speaking

Traveling and speaking is something I really like to do and the MVP program has given me many opportunities to scratch this itch. I also need to thank my previous boss and mentor Nipa Chakravarti for all of the support that she has provided which made all of this possible.

In Q2, I once again had a chance to head to Europe to speak at BizTalk360’s Integrate Event with the Microsoft Product Group.  My topic was on Industrial IoT and some of the project work that we had been working on. You can find a recording of this talk here.

On stage….

I really like this photo as it reminds me of the conversation I was having with Sandro.  He was trying to sell me a copy of his book, and I was trying to convince him that if he gave me a free copy, I could help him sell more.  Sandro has to be one of the hardest-working MVPs I know and is recognized as one of the top Microsoft Integration gurus.  If you are ever having a problem in BizTalk, there is a good chance he has already solved it.  You can find his book here in both digital and physical versions.

BizTalk360 continues to be an integral part of the Microsoft Integration community.  Their 2016 event had record attendance from more than 20 countries.  Thank-you BizTalk360 for another great event and for building a great product.  We use BizTalk360 every day to monitor our BizTalk and Azure services.

On a bit of a different note, this past year we had a new set of auditors come in for SOX compliance.  For the first time in my experience, the auditors were really interested in how we were monitoring our interfaces and what our governance model was.  We passed the audit with flying colours, and that was in large part thanks to having BizTalk360.  Without it, our results would not have been what they were.

Q3

Things really started to heat up in Q3.  My first of many trips was out to Toronto to speak at Microsoft Canada’s Annual General Meeting. I shared the stage with Microsoft Canada VP Chris Barry as we chatted about Digital Transformation and discussed our experiences with moving workloads to the cloud.

Next up was heading to the southeast United States to participate in the BizTalk Bootcamp. This was my third time presenting at the event.  I really enjoy speaking at this event as it is very well run and is held in a very intimate setting.  I have had the chance to meet some really passionate integration folks at this meetup, so it was great to catch up once again.  Thank-you Mandi Ohlinger and the Microsoft Pro Integration team for having me out in Charlotte once again.

At the Bootcamp talking about Azure Stream Analytics Windowing.

The following week, I was off to Atlanta to speak at Microsoft Ignite.  Speaking at a Microsoft premier conference like Ignite (formerly TechEd) has been a bucket list item so this was a really great opportunity for me.  At Ignite, I was lucky enough to participate in two sessions.  The first session that I was involved in was a customer segment as part of the PowerApps session with Frank Weigel and Kees Hertogh.  During this session I had the opportunity to show off one of the apps my team has built using PowerApps.  This app was also featured as part of a case study here.

On stage with PowerApps team.

Next up, was a presentation with John Taubensee of the Azure Messaging team.  Once again my presentation focused on some Cloud Messaging work that we had completed earlier in the year.  Working with the Service Bus team has been fantastic this year.  The team has been very open to our feedback and has helped validate different use cases that we have.  In addition to this presentation, I also had the opportunity to work on a customer case study with them.  You can find that document here. Thanks Dan Rosanova, John Taubensee, Clemens Vasters and Joe Sherman for all the support over the past year.

Lastly, at the MVP Summit in November, I had the opportunity to record a segment in the Channel 9 studio.  Having watched countless videos on Channel 9, this is always a neat experience.  The segment is not public yet, but I will be sure to post when it is.  Once again, I had the opportunity to hang out with Sandro Pereira before our recordings.

In the booth, recording.

Prepping in the Channel 9 studio

Writing

I continue to write for InfoQ on Richard Seroter’s Cloud Editorial team.  It has been a great experience writing as part of this team.  Not only do I get exposed to some really smart people, I also get exposed to a lot of interesting topics that fuel my career growth.  In total, I wrote 46 articles, but here are my top 5 that I either really enjoyed writing or learned a tremendous amount from.

  • Integration Platform as a Service (iPaaS) Virtual Panel – In this article, I had the opportunity to interview some thought leaders in the iPaaS space from some industry-leading organizations.  Thank-you Jim Harrer (Microsoft), Dan Diephouse (MuleSoft) and Darren Cunningham (SnapLogic) for taking the time to contribute to this feature.  I hope to run another panel in 2017 to gauge how far iPaaS has come.
  • Building Conversational and Text Interfaces Using Amazon Lex – After researching this topic, I immediately became interested in Bots and Deep Learning.  It was really this article that acted as a catalyst for spending more time in this space and writing about Google and Microsoft’s offerings.
  • Azure Functions Reach General Availability – Something that I like to do, when possible, is to get a few sound bites from people involved in the piece of news that I am covering.  I met Chris Anderson at the Integrate event earlier in the year, so it was great to get more of his perspective when writing this article.
  • Microsoft PowerApps Reaches General Availability – Another opportunity to interview someone directly involved in the news itself.  This time it was Kees Hertogh, a Senior Director of Product Marketing at Microsoft.
  • Netflix Cloud Migration Complete – Everyone in the industry knows that Netflix is a very innovative company that has disrupted and captured markets from large incumbents.  I found it interesting to get more insight into how they have accomplished this.  Many people probably thought the journey was very short, but I found that wasn’t the case.  It was a very methodical approach that actually took around 8 years to complete.

Another article that I enjoyed writing was for the Microsoft MVP blog called Technical Tuesday.  My topic focused on Extending Azure Logic Apps using Azure Functions. The article was well received and I will have another Technical Tuesday article published early in the new year.

Back to School

Blockchain

I left this topic off of the top 5 deliberately as I will talk about it here, but it absolutely belongs up there. Back in June, I covered a topic for InfoQ called Microsoft Introduces Project Bletchley: A Modular Blockchain Fabric.  I really picked up this topic out of our Cloud queue as my boss at the time had asked me about Blockchain and I didn’t really have a good answer. After researching and writing about the topic, I had the opportunity to attend a Microsoft presentation in Toronto for Financial organizations looking to understand Blockchain.  At the Microsoft event (you can find a similar talk here), Alex Tapscott gave a presentation about Blockchain and where he saw it heading.  ConsenSys, a Microsoft partner and Blockchain thought leader, was also there talking about the Brooklyn Microgrid. I remember walking out of the venue that day thinking everything was about to change.  And it did.  I needed to better understand blockchain.

For those that are not familiar with blockchain, simply put, it is a paradigm that focuses on using a distributed ledger for recording transactions and providing the ability to execute smart contracts against these transactions.  An underlying principle of blockchain is to address the transfer of trust amongst different parties.  Historically, this has been achieved through intermediaries that act as a “middleman” between trading partners.  In return, the intermediary takes a cut on the transaction, but doesn’t really add a lot of value beyond collecting and dispersing funds.  Trading parties are then left to deal with the terms that the intermediary sets.  Using this model typically does not provide incentives for innovation, in fact it typically does the opposite and stifles it due to complacency and entitlement by large incumbent organizations.

What you will quickly discover with blockchain is that it is more about business than technology.  While technology plays a very significant role in blockchain, if your conversation starts off with technology, you are headed in the wrong direction.  With this in mind, I read Blockchain Revolution by Alex and Don Tapscott which really focuses on the art of the possible and identifying some modern-day scenarios that can benefit from blockchain.  While some of the content is very aspirational, it does set the tone for what blockchain could become.

Having completed the book, I decided to continue down the learning path.  I wanted to now focus on the technical path.  I am a firm believer that in order for me to truly understand something, I need to touch it.  By taking the Blockchain Developer course from B9Lab I was able to get some hands on experience with the technology.  As a person that spends a lot of time in the Microsoft ecosystem, this was a good learning opportunity to get back into Linux and more of the open source community as blockchain tools and platforms are pretty much all open source.  Another technical course that I took was the following course on Udemy.  The price point for this course is much lower, so it may be a good place to start without making a more significant financial investment in a longer course.

Next, I wanted to be able to apply some of my learnings.  I found the Future Commerce certificate course from MIT.  It was a three-month course, all delivered online.  There were about 1000 students, worldwide, in the course and it was very structured and based upon a lot of group work.  I had a great group that I worked with on an Energy-based blockchain startup.  We had to come up with a business plan, pitch deck, solution architecture and go-to-market strategy. Having never been involved in a start-up at this level (I did work for MuleSoft, but they were at more than 300 people at the time), it was a great experience to work through this under the tutelage of MIT instructors.

If you are interested in the future of finance, aka FinTech, I highly recommend this course.  There is a great mix of Finance, Technology, Entrepreneur, Risk and Legal folks in this class, and you will learn a lot.

Gary Vaynerchuk

While some people feel that Twitter is losing its relevancy, I still get tremendous value out of the platform.  The following is just one example.  Someone I follow on Twitter is Dona Sarkar from Microsoft.  I had the opportunity to see her speak at the Microsoft World Partner Conference and quickly became a fan.  Back in October, she put out the following tweet, which required further investigation on my part.

Dona’s talks, from the ones that I have seen, are very engaging and entertaining at the same time.  If she is talking about “Gary Vee” in this manner, I am thinking there is something here.  So I started to digest some of his content and was very quickly impressed.  What I like about Gary is he has a bias for action.  Unfortunately, I don’t see this too often in Enterprise IT shops; we try to boil the ocean and watch initiatives fail because people have added so much baggage that the solution is unachievable or people have become disenfranchised.  I have also seen people being rewarded for building “strategies” without a clue how to actually implement them.  I find this really prevalent in Enterprise Architecture, where some take pride in not getting into the details.  While you may not need to stay in the details for long, without understanding the mechanics, a strategy is just a document.  And a strategy that has not/cannot be executed is useless.

If you have not spent time listening to Gary, here are some of his quotes that really resonated with me.

  • Bet on your strengths and don’t give a f&%# about what you are not good at.
  • Educate…then Execute
  • You didn’t grow up driving, but somehow you figured it out.
  • Results are results are results
  • I am just not built, to have it dictate my one at-bat at life.
  • Document, Don’t Create.
  • We will have people who are romantic and hold onto the old world who die and we will have people that execute and story tell on the new platform who emerge as leaders in the new world.
  • I am built to get punched in the mouth, I am going spit my front tooth out and look right back at you and be like now what bitch.

If this sounds interesting to you, check out a few of his content clips that I have really enjoyed:

Looking Forward

I find it is harder and harder to do this.  The world is changing so fast, why would anyone want to tie themselves down to an arbitrary list? Looking back on my recap from last year, you won’t find blockchain or bots anywhere in that post, yet those are two of the transformative topics that really interested me in 2016.  But, there are some constants that I don’t see changing.  I will continue to be involved in the Microsoft Integration community, developing content, really focused on iPaaS and API Management.  IoT continues to be really important for us at work so I am sure I will continue to watch that space closely.  In fact, I will be speaking about IoT at the next Azure Meetup in Calgary on January 10th.  More details here.

I will also be focusing on blockchain and bots/artificial intelligence as I see a lot of potential in these spaces.  One thing you can bet on is that I will be watching the markets closely and looking for opportunities where I see a technology disrupting or transforming incumbent business models.

Also, it looks like I will be running a marathon again in 2017.  My training has begun and I am just awaiting confirmation into the race.


Fixing the Unable to process template language expressions in action HTTP Unexpected token StartObject Error in Azure Logic Apps

I have been working a lot with Azure Logic Apps over the past month.  Since I am new to Logic Apps, I often run into silly issues that turn out to be trivial to fix.  This is one of them.

I was working with the HTTP REST API shape to call a custom API – actually the Azure REST API, to perform an action on another Logic App – but more on that later.

I was setting Content Type and Authorization inside the Headers field as shown below:

I kept receiving this error:

Unable to process template language expressions in action 'HTTP' inputs at line '1' and column '1234': 'Error reading string. Unexpected token: StartObject. Path 'headers.authentication'.'.

The fix was super simple.  I had not expanded the Show Advanced Options for this shape.  Once expanded, I could see that Authorization is broken out from the other Headers.  I moved the Authorization section from the Headers to there and it worked as expected!

So, note to self: if something does not work as expected, try expanding the Advanced Options section of the shape to see if that might help.
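In code view, the Authorization entered through Show Advanced Options lands in a dedicated authentication property on the HTTP action’s inputs rather than in headers. A hedged sketch of what that looks like (the URI, authentication type, and credential values here are placeholders):

```json
"HTTP": {
    "type": "Http",
    "inputs": {
        "method": "GET",
        "uri": "https://example.com/api/resource",
        "headers": {
            "Content-Type": "application/json"
        },
        "authentication": {
            "type": "Basic",
            "username": "<username>",
            "password": "<password>"
        }
    }
}
```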

 

Assigning an Integration Account to an Azure Logic App inside Visual Studio

I have been working heads down for a few weeks now with Azure Logic Apps.  While I have worked with them off and on for over a year now, it is amazing how far things have evolved in such a short amount of time.  You can put together a rather complex EDI scenario in just a few hours with no up-front hardware and licensing costs.

I have been creating Logic Apps both using the web designer and using Visual Studio 2015.

Recently I was trying to use the Transform Shape that is part of Azure Integration Accounts (still in Technical Preview).  I was able to set all the properties and manually enter a map name.  Then I ran into issues.

I found if I switched to code view I was not able to get back to the Designer without manually removing the Transform Shape.  I kept getting the following error:  The 'inputs' of workflow run action 'Transform_XML' of type 'Xslt' is not valid. The workflow must be associated with an integration account to use the property 'integrationAccount'.

What I was missing was setting the Integration Account for this Logic App.  Using the web interface, it’s very easy to set the Integration Account.  But I looked all over the JSON file and Visual Studio for how to set the Integration Account for a Logic App inside Visual Studio.

With the help of Jon Fancy, it turns out it is super simple.  It is just like an Orchestration property.

To set the Integration Account for a Logic App inside Visual Studio, do the following:

1. Ensure you have an Integration Account already created inside the subscription and Azure Location.

2. Make sure you set the Integration Account BEFORE trying to use any shapes that depend on it, like the Transform Shape.

3. Click anyplace in the white space of the Visual Studio Logic App.

4. Look inside the Properties window for the Integration Account selection.

5. Select the Integration Account you want to use and save your Logic App.
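Behind the scenes, this sets the integrationAccount property on the workflow resource in the deployment template. It looks roughly like this (the subscription, resource group, and account name segments are placeholders):

```json
"properties": {
    "integrationAccount": {
        "id": "/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.Logic/integrationAccounts/{integrationAccountName}"
    }
}
```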

It’s that simple! 

Enjoy.

Want Microsoft Azure resources with a view? Azure prices vary by Datacenter even in the United States

In May, I gave a session at Integrate 2016 titled Azure IaaS Essentials for the BizTalk Developer (watch online now).

In that session I outlined that prices for Azure resource vary by data center. 

In case you did not know, the price you pay for Azure resources in a US datacenter can vary from that in Brazil (expensive), Japan, India, and so on.

What is interesting, though, is that even data centers in the United States have different prices.

From what I’ve seen, the East 2 and West 2 data centers seem to have better prices than a lot of the other US data centers.

I checked some prices on Virtual Machines and Storage – not all Azure resources – but some prices were as much as 13% lower!

If you use a lot of Azure, savings of up to 13% or maybe more can really add up.

The key takeaway is to ensure you check the prices of the resources in multiple data centers if you have the ability to do so for your scenario.

The Current State of iPaaS

In addition to my day job of being an Enterprise Architect at an Energy company in Calgary, I also write for InfoQ.  Writing for InfoQ allows me to explore many areas of Cloud and engage with many thought leaders in the business.  Recently, I had the opportunity to host a Virtual Panel on Integration Platform as a Service (iPaaS).

Participating in this panel were:

  • Dan Diephouse – Director of Product Management at MuleSoft where he is part of the team that launched MuleSoft’s iPaaS offering: CloudHub.
  • Darren Cunningham – Vice President of Marketing at SnapLogic where he focuses on product management and outbound product marketing.
  • Jim Harrer – Principal Group Program Manager in the Cloud & Enterprise division, at Microsoft, where his team is responsible for Program Management for BizTalk Server and Microsoft’s iPaaS offering: Azure Logic Apps.

Overall, I was very happy with the outcome of the article.  I think the panelists offered some great insight into the current state of iPaaS and where this paradigm is headed.

You can read the entire article here and feel free to add comments in the article’s comment section.


Azure Logic Apps–Deleting Items From SharePoint (Online) List

I have a scenario at work where we need to provide some simple synchronization between a SQL Azure table and a SharePoint Online Custom List. As a pre-requisite, each morning before business users get into the office, we need to purge the contents from the SharePoint list and update it with today’s data plus a 6-day forecast of future data.

I have integrated  BizTalk with custom SharePoint Lists in the past, and even wrote about it in one of my books. It wasn’t a particularly good experience so I was interested in evaluating how Logic Apps would deal with custom lists. 

One difference between BizTalk and Logic Apps, in this case, is that BizTalk has a SharePoint Adapter but it will only interface with SharePoint Document Libraries. If you want to integrate BizTalk with SharePoint Custom Lists, you are likely going to do so with the Lists.asmx web service.  While it is completely possible to use this custom web service approach, be prepared to spend a few hours (if you are lucky) getting everything working.

With Logic Apps, it is a very different experience.  From the Logic Apps canvas you need to add a trigger to kick off your workflow.  In my case, I used a Recurrence trigger that will run every day at 7:45 am.  I can also kick this trigger off manually in the Azure Portal if I wish.
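In code view, a daily 7:45 am trigger can be expressed roughly like this (a sketch; depending on the schema version you may instead combine a startTime with a Day frequency):

```json
"triggers": {
    "Recurrence": {
        "type": "Recurrence",
        "recurrence": {
            "frequency": "Day",
            "interval": 1,
            "schedule": {
                "hours": [ 7 ],
                "minutes": [ 45 ]
            }
        }
    }
}
```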

Next, I want to add an action and then search for SharePoint from the Microsoft managed APIs dropdown.  After we do that, all available SharePoint Online operations will be displayed.

In my case, I want to purge all items, but there is no Delete List operation.  Instead, I need to get all items in my list first, so that I can use the ID from each record to delete that item.  In my scenario I expect 7 days * 24 hourly records (168 items) to be in my list at any given time so this load is not a concern.

With this situation in-mind, I will select the Get Items  operation.

Once I have selected my Get Items operation, I need to establish my connection to my Office 365 subscription. (I will spare you from the password prompts)

With my connection established, I now need to provide the URL to my SharePoint List.  I can also specify optional parameters that control the number of items returned. 

I must say the experience in this dialog is a good one.  I can click on the Site URL dropdown and all of the sites that I have access to will be in that list.  Once I have selected my URL and then click on the List Name dropdown, I then see all the lists that I have access to on that site.

Next, I need to add another activity and this time I will select the Delete Item operation.

I have a similar experience in the Delete Item dialog that I had in the Get Items dialog.  With my connection already established, I need to provide the same Site URL  and the same List Name.  What is different this time is I need to provide an ID for the list item that I would like to delete.  In this case it will be an ID  that is coming from my Get Items response.

You might be asking yourself – well, how is that going to work for all of the items in my list?  Don’t you need a for loop to iterate through the Get Items collection? The answer is yes, but the Logic Apps team has made this very simple – they have done the heavy lifting for you.  If you go into Code View you can see it there:

"foreach": "@body('Get_items')['value']",
"inputs": {
    "host": {
        "api": {
            "runtimeUrl": "https://logic-apis-westus.azure-apim.net/apim/sharepointonline"
        },
        "connection": {
            "name": "@parameters('$connections')['sharepointonline']['connectionId']"
        }
    },
    "method": "delete",
    "path": "/datasets/@{encodeURIComponent(encodeURIComponent(string('https://SharePointsharepoint.com/sites/HOPOC/Bighorn/SitePages/Home.aspx')))}/tables/@{encodeURIComponent(encodeURIComponent(string('c28b1ea2-e2a0-4faf-b7c2-3eerec21a8b')))}/items/@{encodeURIComponent(string(item()['ID']))}"
},

Awesome!

The end result is a Logic App that looks like this:

I can now run my Logic App from the Azure Portal by clicking on Select Trigger and then recurrence.

I can follow my execution and dive into my Get Items and Delete Items calls.  By inspecting my inbound and outbound traces I can see the exact payloads and HTTP Status codes from the underlying operations.  In this case I was able to delete 300 items in 23 seconds.  For each item in my collection, a Delete Item call is made.

Conclusion

It honestly took me about 10 minutes to figure this out.  Part of the reason why I am writing about this is I know how long this would take with BizTalk. I anticipate it would take at least 3 hours to do this in BizTalk if it was your first time.  So this isn’t a knock against BizTalk, as Logic Apps has been built, IMO, for these lightweight scenarios with little friction.

In the short term I think Microsoft Integration architects and developers will have many opportunities like this one where you can choose one tool or the other.  For me, developer productivity needs to be part of that equation.  Where the systems that you are integrating are hosted will also play a role.  In this case, I am connecting to SharePoint Online and SQL Azure so it also doesn’t make sense, IMO, to route all of this info back on-premises only to go back up to the cloud.

We may also see a Hybrid approach where a BizTalk process can call out to a Logic App where you can take advantage of these new Logic App connectors.  This was something that Microsoft discussed at 2016 Integrate Event in London.

In a future blog post we will talk about the other half of this solution which is how to create items in SharePoint Online from SQL Azure.
