Azure Functions: .NET 8 Functions worker is missing from the Azure Function project in Visual Studio 2022

It is always fun to return to one of my favorite topics: errors and warnings, causes and solutions, this time on Azure. Yesterday was not a fun night, though! And all of that because of the nightmare it sometimes is to prepare our developer tools to be fit to develop Azure services.

Some time ago, on November 14, 2023, to be exact, Microsoft announced the general availability of .NET 8 support in Azure Functions in the isolated worker model. Support is available for Windows and Linux on the Consumption, Elastic Premium, and App Service plan hosting options.

So, last night, I decided to create an Azure Function using .NET 8, and to my surprise, the .NET 8 Isolated (Long Term Support) Function worker was missing from the Azure Function project in Visual Studio 2022.

Of course, the first assumption I made was that I didn’t have .NET 8 installed. And, in fact, I didn’t! So I went to the Download .NET SDKs for Visual Studio page and installed the .NET 8.0 SDK.

However, after installing and restarting everything, including the laptop, the .NET 8 Function worker was still missing from the Azure Function project in Visual Studio 2022.

Causes

The cause of this problem or behavior is having an obsolete Azure Functions toolset and/or Visual Studio version currently installed on your machine.

Solutions

The solution to this problem is not quite as intuitive as we would like, but it is nevertheless simple to accomplish once you know what to do.

To resolve this issue/behavior, you need to perform several steps.

Step 1: As I mentioned above, make sure you have the .NET 8.0 SDK installed on your machine.

Step 2: The Visual Studio version matters! You need to upgrade your Visual Studio 2022 to version 17.8 or above. (I am not sure whether 17.8 is the first version to support this Function worker, but I believe so.) To upgrade Visual Studio, you need to:

  • Open Visual Studio 2022 and access the following menu option: Help > Check for Updates.
  • In my case, I had an old version, 17.6.3, that does not support the .NET 8.0 Function worker. Click Update.
  • Once the update is complete, click OK.

Step 3: Install the latest Azure Functions toolsets and templates. After upgrading Visual Studio, we also need to make sure that we have the latest Azure Functions toolsets and templates. To do that, we need to:

  • Open Visual Studio 2022 and access the following menu option: Tools > Options...
  • In the Options window, on the left menu tree, select the option Projects and Solutions > Azure Functions. And then click Check for updates.
  • If updates exist, click Download & Install.
  • Once the update is installed, or if there isn’t any update, a message will appear saying Azure Functions toolsets and templates are up to date. Click OK.
  • Close Visual Studio.

Step 4: (optional but recommended) Remove AzureFunctionsTools folder from LocalAppData

Maybe this step is not required, but after all the failed attempts, this “recipe” worked, at least for me. That being said, I recommend carrying out this optional step as well.

To accomplish this, you need to:

  • On the file system, navigate to %LocalAppData%.
  • Delete the AzureFunctionsTools folder.

Step 5: Restart the machine

I tried to open Visual Studio before restarting the machine, without success. Once I restarted everything, including the machine, I once again tried to create an Azure Function project in Visual Studio, and the .NET 8 Isolated (Long Term Support) Function worker was finally available for me to use.
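For context, a project created with that worker boots through a minimal Program.cs. A rough sketch, assuming the default isolated-worker bootstrap (newer templates may use ConfigureFunctionsWebApplication instead):

using Microsoft.Extensions.Hosting;

// Minimal bootstrap of a .NET 8 isolated worker Functions app (sketch).
// ConfigureFunctionsWorkerDefaults wires the worker up to the Functions host.
var host = new HostBuilder()
    .ConfigureFunctionsWorkerDefaults()
    .Build();

host.Run();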

Hope you find this helpful! So, if you liked the content or found it helpful and want to help me write more content, you can buy (or help buy) my son a Star Wars Lego! 

Author: Sandro Pereira

Sandro Pereira lives in Portugal and works as a consultant at DevScope. In the past years, he has been working on implementing Integration scenarios both on-premises and cloud for various clients, each with different scenarios from a technical point of view, size, and criticality, using Microsoft Azure, Microsoft BizTalk Server and different technologies like AS2, EDI, RosettaNet, SAP, TIBCO etc.

He is a regular blogger, international speaker, and technical reviewer of several BizTalk books all focused on Integration. He is also the author of the book “BizTalk Mapping Patterns & Best Practices”. He has been awarded MVP since 2011 for his contributions to the integration community.

Building BizTalk Solution Error: Couldn’t process file due to its being in the Internet or Restricted zone or having the mark of the web on the file.

One error never comes alone! Following the error reported in my last blog post while working last week with one of my clients, I caught an error that I had never seen in all these long years working with BizTalk Server. It appeared while trying to build a BizTalk Server Visual Studio solution, in this specific case a custom pipeline component:

Couldn’t process file XMLAttributesStripper.resx due to its being in the Internet or Restricted zone or having the mark of the web on the file. Remove the mark of the web if you want to process these files.

Indeed, I downloaded that resource from the Internet, from my GitHub page, as I have done thousands of times for many clients and projects!

Cause

This issue happens because you downloaded these files/resources from the web on a machine with security restrictions configured. When Visual Studio attempts to build the project, the error occurs because the .NET Framework resource compiler honors this marker and refuses to compile those resource files for security reasons.

The underlying cause is that the respective resource file has the so-called mark of the web applied to it. This is a marker that browsers place on downloaded files (on NTFS it is stored in an alternate data stream named Zone.Identifier) so that other applications can make informed decisions on whether or not to trust that file.

Solution

To fix this issue, the solution is quite simple. Nevertheless, there are many ways to solve or avoid this issue.

Solution 1: Fix the issue

To solve this issue, we need to remove the mark of the web, to do that, we need to:

  • Right-click on the file in Windows Explorer and select Properties.
  • On the General tab, at the bottom under Security, there is a check box to remove the mark of the web.
  • Select the Unblock check box and click OK.

Note: This needs to be done with Visual Studio closed.

Solution 2: Fix the issue with PowerShell script

We can also do the same functionality as Solution 1 using the following PowerShell script:

  • In the folder for the project, run the following script:
dir -Path . -Recurse | Unblock-File
  • or, without the alias:
Get-ChildItem -Path . -Recurse | Unblock-File

Solution 3: Fix the issue from Visual Studio

I didn’t try this approach, but apparently we can also fix this issue directly from Visual Studio:

  • Select the menu option Tools > Options.
  • In the Options window, select Trust Settings under Environment and add the project path as a trusted path.

Credits to Cody J. Mathis

Solution 4: Avoid this issue from happening again

I didn’t try this approach either, but apparently we can avoid this issue from happening again by trusting the download site:

  • Open Internet Explorer and click on the gear in the top right corner.
  • Click Internet Options and select the Security tab.
  • Click Trusted Sites and then on Sites button.
  • Enter the URL of the site you want to trust, in this case, GitHub, and click Add.
  • Click OK.

BizTalk Server tips and tricks for developers: How to create a .NET class from a schema

Sometimes we want to bypass the adapters and perform the communication thru .NET code. Sometimes we want to access multiple values, or recursive values, of a message inside a helper class that supports an orchestration. Sometimes we want to perform a complex transformation or construct a message thru .NET code instead of a BizTalk Server map.

And for each of these circumstances we can have several approaches; one that is simple, effective, and common to all of them is to use the XML Schema to generate a .NET class. This way, you have a quick and straightforward way to represent the messages you get from your BizTalk processes as a .NET class.

The easy way to generate classes that conform to a specific schema is to use the XML Schema Definition tool (Xsd.exe). You can do that by:

  • Open a Developer Command Prompt for VS <edition>
  • On the command prompt, navigate to the Schema folder and type
    • xsd /classes /language:CS Schema1.xsd

For complex schemas that import or include other schemas, you need to provide all the dependencies, for example:

  • xsd /classes /language:CS Schema1.xsd Schema2.xsd

Now you will be able to import the C# class generated by the tool into your helper project, and the rest is simple!

On the Orchestration you can create a variable that represents that class:

  • On the property Type, select <.NET Class…> and select the class that you created previously.
  • From the Browse and Select .NET Type to reference dialog, select the proper assembly and the Type.

Then, by using C# code inside a Message Assign or Expression shape, you can serialize or deserialize an XML document into C# and vice versa in a simple and straightforward way:

//CONVERT MESSAGE INTO C# OBJECT
varPersonMsg = msgInput;

//CONVERT C# OBJECT INTO BIZTALK MESSAGE
msgOutput4 = varPersonMsg;

or if we have a function in a helper class like:

public static XmlDocument MapPersons(Person person)
{
  ...
}

we can directly pass the message to the function without the need to create a variable:

msgOutput4 = Support.Mapping.MapPersons(msgInput);
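For reference, here is a minimal sketch of what such a helper could look like internally, using the XmlSerializer that the xsd.exe-generated classes are built for (the Person type comes from xsd.exe; the transformation itself is left as a placeholder):

using System.IO;
using System.Xml;
using System.Xml.Serialization;

namespace Support
{
    public class Mapping
    {
        // Sketch: transform the strongly typed object, then serialize it back
        // into an XmlDocument that can be assigned to a BizTalk message.
        public static XmlDocument MapPersons(Person person)
        {
            // ... transformation logic over the object goes here ...

            var serializer = new XmlSerializer(typeof(Person));
            var doc = new XmlDocument();
            using (var stream = new MemoryStream())
            {
                serializer.Serialize(stream, person);
                stream.Position = 0;
                doc.Load(stream);
            }
            return doc;
        }
    }
}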


Automate IBM MQ object creation with PCF in .NET using IKVM.NET

TL;DR – There is no decent .NET support for PCF in the IBM client, but we can use IKVM.NET to convert JARs to DLLs so we can still use .NET instead of Java to use PCF.

Intro

Programmable Command Formats (PCFs) define command and reply messages that can be used to create objects (Queues, Topics, Channels, Subscriptions,…) on IBM Websphere MQ. For my current project we wanted to build a custom REST API to automate object creation based on our custom needs. The IBM MQ REST API was not a possible alternative at that moment in time.

The problem

The .NET PCF namespaces are not supported/documented by IBM and do not provide the possibility to inquire the existing subscriptions on a queue manager. All the other tasks we wanted to automate are possible in .NET. Using Java seemed to be the only alternative if we wanted to build this custom REST API with all its features.

Action                    PCF Command                   Result  Info
Create Local/Alias Queue  MQCMD_CREATE_Q                OK
Delete Queue              MQCMD_DELETE_Q                OK
List Queues               MQCMD_INQUIRE_Q               OK
Purge Queue               MQCMD_CLEAR_Q                 OK
Create Subscription       MQCMD_CREATE_SUBSCRIPTION     OK
List Subscriptions        MQCMD_INQUIRE_SUBSCRIPTION    NOK     Link1, Link2
Delete Subscription       MQCMD_DELETE_SUBSCRIPTION     OK

Being able to use the .NET platform was a requirement at that time, because the whole build and deployment pipeline was focused on .NET.

IKVM.NET

After some searching I stumbled upon IKVM.NET:

“IKVM.NET is a JVM for the Microsoft .NET Framework and Mono. It can both dynamically run Java classes and can be used to convert Java jars into .NET assemblies. It also includes a port of the OpenJDK class libraries to .NET.”

Based on this description it sounded like it could offer a possible solution!

Using IKVM.NET we should be able to convert the IBM JARs to .NET assemblies and use the supported and documented IBM Java Packages from a .NET application.

From JAR to DLL

Now I will shortly explain how we were able to put it all together. Using IKVM.NET is not that easy when you use it for the first time. The whole process basically consists of 3 steps:

  1. Download (and Install) the IBM MQ redistributable client (in order to extract the JAR files)
  2. Convert JARs to DLLs with IKVM.NET
  3. Copy DLLs and Reference in .NET project
    • The IBM Converted JARs and the IKVM.NET Runtime dlls

Convert JAR to DLL

Download IKVM: https://sourceforge.net/projects/ikvm/
Extract the IKVM files (c:\tools\IKVM)
I have the IBM client installed, so the JAR files are in their default installation folder (C:\Program Files\IBM\MQ\java\lib)

Open up a Command Prompt:

set path=%path%;c:\tools\IKVM\bin

cd C:\Program Files\IBM\MQ\java\lib

ikvmc -target:library -sharedclassloader { com.ibm.mq.jar } { com.ibm.mq.jmqi.jar } { com.ibm.mq.headers.jar } { com.ibm.mq.pcf.jar }

You will find the output in the source directory of the JAR files:

Add References…

Now that we have our DLLs, we can add them to our .NET project. This seemed less easy than I thought; I spent a lot of time figuring out what dependencies I needed. In the end, this was my result:

1 = The IBM JARs converted to DLLs
2 = The IKVM.NET runtime DLLs

MqPcfAutomation Sample

To help you get started, I added my sample proof of concept solution to GitHub in the MqPcfAutomation repository.

In the sample I showcase the functionality described in the table at the beginning of this post.
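To give an idea of what the converted API feels like from C#, here is a minimal sketch that inquires the local queue names on a queue manager. It follows the IBM Java PCF API; host, port and channel are placeholders, and depending on the client version the PCF classes live in com.ibm.mq.pcf or com.ibm.mq.headers.pcf:

using com.ibm.mq.pcf;        // PCFMessageAgent, PCFMessage (converted from the IBM JARs)
using com.ibm.mq.constants;  // CMQC / CMQCFC constants

public class InquireQueueNames
{
    public static void Main()
    {
        // Connect the PCF agent to the queue manager's command server.
        PCFMessageAgent agent = new PCFMessageAgent("mqhost", 1414, "SYSTEM.DEF.SVRCONN");
        try
        {
            // Build the "inquire queue names" command for all local queues.
            PCFMessage request = new PCFMessage(CMQCFC.MQCMD_INQUIRE_Q_NAMES);
            request.addParameter(CMQC.MQCA_Q_NAME, "*");
            request.addParameter(CMQC.MQIA_Q_TYPE, CMQC.MQQT_LOCAL);

            // send() returns one PCF message per reply from the command server.
            PCFMessage[] responses = agent.send(request);
            string[] names = (string[])responses[0].getParameterValue(CMQCFC.MQCACF_Q_NAMES);
            foreach (string name in names)
            {
                System.Console.WriteLine(name.Trim());
            }
        }
        finally
        {
            agent.disconnect();
        }
    }
}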

Conclusion

In the end I am happy that I was able to build a solution for the problem. The question is whether this approach is advisable…
I don’t think IBM approves of this approach, but it works for what we need. We have been using this solution for more than 6 months now without any issues. In the future we might be able to move to the IBM MQ REST API as more features are added.

Route Blob storage events to multiple subscribers using Azure Event Grid

A few weeks ago, the Azure Event Grid service became available in preview. This service enables centralized management of events in a uniform way. It scales with you when the number of events increases, which is made possible by the foundation Event Grid relies on: Service Fabric. Not only does it auto-scale, you also do not have to provision anything besides an Event Topic to support custom events (see my blog Routing an Event with a custom Event Topic). Event Grid is serverless; you only pay for each action (ingress events, advanced matches, delivery attempts, management calls). The price will be 30 cents per million actions in preview, and 60 cents once the service goes GA.

Azure Event Grid can be described as an event broker that has one or more event publishers and subscribers. Event publishers are currently Azure Blob Storage, resource groups, subscriptions, Event Hubs and custom events. More will be added in the coming months, like IoT Hub, Service Bus, and Azure Active Directory. Subsequently, there are consumers of events (subscribers) like Azure Functions, Logic Apps, and WebHooks. And more will be added on the subscriber side too, with Azure Data Factory, Service Bus and Storage Queues, for instance.

Azure Event Grid Storage registration

Currently, to capture Azure Blob Storage events you will need to register your subscription through a preview program. Once you have registered your subscription, which could take a day or two, you can leverage Event Grid with Azure Blob Storage, but only in West Central US!

The Microsoft documentation on Event Grid has a section “Reacting to Blob storage events”, which contains a walkthrough to try out the Azure Blob Storage as an event publisher.

Azure Event Grid Storage Account Events Scenario

Having registered my subscription to the preview program, I started exploring its capabilities, as sample scenarios were explained on the Event Grid landing page. I wanted to try out the serverless architecture sample, where one can use Event Grid to instantly trigger a serverless function to run image analysis each time a new photo is added to a blob storage container. Hence, I built a demo according to the diagram below.

An image will be uploaded to a Storage blob container, which will be the event source (publisher). The Storage blob container belongs to a Storage Account with the Event Grid capability. And the Event Grid has three subscribers: a WebHook (Request Bin) to capture the output of the event, a Logic App to notify me that a blob has been created, and an Azure Function that will analyze the image created in the blob storage, by extracting the URL from the event and using it to analyze the actual image.

Intelligent routing

The screenshot below depicts the subscriptions on the events of the Blob Storage account. The WebHook will subscribe to every event, while the Logic App and Azure Function are only interested in the BlobCreated event, in a particular container (prefix filter) and of a particular type (suffix filter).

Besides being centrally managed, Event Grid offers intelligent routing, which is its core feature. You can use filters for the event type or the subject pattern (prefix and suffix). The filters are intended for subscribers to indicate what type of event and/or subject they are interested in. When we look at our scenario, the event subscription for the Azure Function is as follows:

  • Event Type : Blob Created
  • Prefix : /blobServices/default/containers/testcontainer/
  • Suffix : .jpg

The prefix, a filter object, looks for the beginsWith in the subject field of the event. And the suffix looks for the subjectEndsWith in, again, the subject. In the event above, you see that the subject has the specified prefix and suffix. See also the Event Grid subscription schema in the documentation, as it explains the properties of the subscription schema.
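For our Azure Function, the subscription boils down to something like the following sketch, based on the documented subscription schema (the endpoint URL is a placeholder):

{
  "properties": {
    "destination": {
      "endpointType": "WebHook",
      "properties": {
        "endpointUrl": "<function endpoint, including the function key>"
      }
    },
    "filter": {
      "includedEventTypes": [ "Microsoft.Storage.BlobCreated" ],
      "subjectBeginsWith": "/blobServices/default/containers/testcontainer/",
      "subjectEndsWith": ".jpg",
      "subjectIsCaseSensitive": false
    }
  }
}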

The Azure Function is only interested in a Blob Created event with a particular subject and content type (image .jpg). And this will be apparent once you inspect the incoming event to the function.

The same intelligence applies to the Logic App, which is interested in the same event. The WebHook subscribes to all the events and lacks any filters.

The scenario solution

The solution consists of a storage account (blob), a registered subscription for Event Grid Azure Storage, Request Bin (WebHook), a Logic App and a Function App containing a function. The Logic App and Azure Function subscribe to the BlobCreated event with the filter settings.

The Logic App subscribes to the event once the trigger action is defined. The definition is shown in the picture below.

Note that the resource name has to be specified explicitly (custom value), as the resource type Microsoft.Storage has to be set explicitly too. The resource types that are listed are Resource Groups, Subscriptions, Event Grid Topics and Event Hub Namespaces, as Storage is still in a preview program. With this configuration, the desired events can be evaluated and processed. In the case of the Logic App, it parses the event and sends an email notification.

Azure Function Storage Event processing

The Azure Function is interested in the same event. As soon as the event is pushed to Event Grid, once a blob has been created, it will process the event. The URL in the event, https://teststorage666.blob.core.windows.net/testcontainer/NinoCrudele.jpg, will be used to analyze the image. The image is a picture of my good friend Nino Crudele.

This image will be streamed from the function to the Cognitive Services Computer Vision API. The result of the analysis can be viewed in the monitor tab of the Azure Function.
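In outline, the function can look something like the C# script sketch below. It is HTTP-triggered; the Vision API region, version and key are placeholders, and a production endpoint would also need to answer Event Grid's subscription-validation handshake:

#r "Newtonsoft.Json"

using System.Net;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Threading.Tasks;
using Newtonsoft.Json.Linq;

private static HttpClient client = new HttpClient();

public static async Task<HttpResponseMessage> Run(HttpRequestMessage req, TraceWriter log)
{
    string body = await req.Content.ReadAsStringAsync();

    // Event Grid delivers events as a JSON array; a BlobCreated event carries the blob URL.
    foreach (JToken evt in JArray.Parse(body))
    {
        string blobUrl = (string)evt["data"]["url"];
        log.Info($"Analyzing {blobUrl}");

        // Forward the blob URL to the Computer Vision API for analysis.
        var request = new HttpRequestMessage(HttpMethod.Post,
            "https://westus.api.cognitive.microsoft.com/vision/v1.0/analyze?visualFeatures=Description");
        request.Content = new StringContent("{\"url\":\"" + blobUrl + "\"}");
        request.Content.Headers.ContentType = new MediaTypeHeaderValue("application/json");
        request.Headers.Add("Ocp-Apim-Subscription-Key", "<vision-api-key>");

        HttpResponseMessage response = await client.SendAsync(request);
        log.Info(await response.Content.ReadAsStringAsync());
    }

    return req.CreateResponse(HttpStatusCode.OK);
}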

The result of the analysis is that Nino is smiling at the camera with confidence. The Logic App will parse the event and send an email. The Request Bin will show the raw event. And in case I delete the blob, this will only be captured by the WebHook (Request Bin), as it is interested in every event on the Storage account.

Summary

Azure Event Grid is unique in its kind, as no other cloud vendor has this type of service that can handle events in a uniform and serverless way. It is still early days, as this service has been in preview for only a few weeks. However, with the expansion of event publishers and subscribers, management capabilities and other features, it will mature in the next couple of months. The service is currently only available in West Central US and West US, yet over the course of time it will become available in every region. And once it becomes GA, the price will increase.

Working with a Storage Account as a source (publisher) of events unlocked new insights into the Event Grid mechanisms. Moreover, it shows the benefits of having a managed service in Azure for events. The pub-sub and push of events are the key differentiators from the other two services, Service Bus and Event Hubs. No longer do you have to poll for events and/or develop a solution for that. To conclude, the Service Bus team has completed the picture for messaging and event handling.

Author: Steef-Jan Wiggers

Steef-Jan Wiggers is all in on Microsoft Azure, Integration, and Data Science. He has over 15 years’ experience in a wide variety of scenarios such as custom .NET solution development, overseeing large enterprise integrations, building web services, managing projects, designing web services, experimenting with data, SQL Server database administration, and consulting. Steef-Jan loves challenges in the Microsoft playing field, combining it with his domain knowledge in energy, utility, banking, insurance, health care, agriculture, (local) government, bio-sciences, retail, travel and logistics. He is very active in the community as a blogger, TechNet Wiki author, book author, and global public speaker. For these efforts, Microsoft has recognized him as a Microsoft MVP for the past 7 years.

A lap around Azure Functions, go serverless!

Serverless is hot and happening. It is not just a buzzword but an interesting new part of Computer Science, which is amazing and also a driver of the second machine age we are currently experiencing. I recently read two books back to back: Computer Science Distilled and The Second Machine Age.

The first book deals with the concepts of Computer Science, and a few aspects of it caught my attention, like breaking a problem into smaller pieces. Hence, in Azure I could use functions to solve parts of a complete problem or process parts of a large workload. The second book discusses the second machine age around automation, robotics, artificial intelligence, and so on. And little repetitive tasks can be built using functions; Azure Functions, to be precise, can automate those little tasks. Thus, why not consolidate my little research into the current state of Azure Functions into a blog post, with the context of both books in the back of my mind.

Serverless

Serverless computing is a reality, and Microsoft Azure provides several platform services that can be provisioned dynamically. Resources are allocated without you worrying about scale, availability and security. And the beauty of it all is that you only pay for what you use.

Azure Functions is one of Microsoft’s serverless capabilities in Azure. Functions enable you to run pieces of code in Azure. Cool eh! And they can be run independently, in an orchestration or flow (Durable Functions), or as part of a Logic App definition or Microsoft Flow.

You provision a Function App, which acts as a container for one or more functions. Subsequently, you either attach an App Service plan to it, when you want to share resources with other services like a web app, or you choose a consumption plan (pay as you go).

Finally, you have the Function App available and you can start adding functions to it, either using Visual Studio, which has templates for building a function, or using the Azure Portal (browser). Both provide features to build and test your function. However, Visual Studio will give you IntelliSense and debugging features.

Function Types

Functions can be built using your language of choice, like C#, F#, or JavaScript (Node.js). Furthermore, there are several types of functions you can build, such as a WebHook + API function or a trigger-based function. The latter can be used to integrate with the following Azure services and SaaS solutions:

  • Cosmos DB
  • Event Hubs
  • Mobile Apps (tables)
  • Notification Hubs
  • Service Bus (queues and topics)
  • Storage (blob, queues, and tables)
  • GitHub (webhooks)
  • On-premises (using Service Bus)
  • Twilio (SMS messages)

The integration is based upon a binding and a trigger, key concepts with Azure Functions. Bindings provide a way to connect to the inputs and outputs of the earlier mentioned services and solutions; see Azure Functions triggers and bindings concepts.

WebHook + API function

A popular quick start template for Azure Functions is the WebHook + API function. This type of function is supported through the HTTP/WebHook binding and enables you to build autonomous functions that can be (re)used in various types of applications, like a Logic App.

After provisioning a Function App you can add a function easily. As shown below you can select a premade function, choose CSharp and click Create this function.

A function named HttpTriggerCSharp1 will be made available to you. The sample is easy to experiment with. I changed the given function to something new, like the sketch below.
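Something along these lines: a sketch based on the stock run.csx template, with the response slightly tweaked (the greeting is just an example):

using System.Net;

// HTTP-triggered C# script function (run.csx), tweaked from the default template.
public static async Task<HttpResponseMessage> Run(HttpRequestMessage req, TraceWriter log)
{
    log.Info("C# HTTP trigger function processed a request.");

    // Read 'name' from the query string...
    string name = req.GetQueryNameValuePairs()
        .FirstOrDefault(q => string.Compare(q.Key, "name", true) == 0)
        .Value;

    // ...or from the JSON request body.
    dynamic data = await req.Content.ReadAsAsync<object>();
    name = name ?? data?.name;

    return name == null
        ? req.CreateResponse(HttpStatusCode.BadRequest, "Please pass a name on the query string or in the request body")
        : req.CreateResponse(HttpStatusCode.OK, $"Hello {name}, greetings from my function!");
}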

And now it gets interesting. You can click Get function URL, as the function is publicly accessible, that is, if you know the function key. By clicking Get function URL you’ll receive a URL that looks like this:

https://myfunctioncollection.azurewebsites.net/api/HttpTriggerCSharp1?code=iaMsbyhujlIjQhR4elcJKcCDnlYoyYUZv4QP9Odbs4nEZQsBtgzN7Q==

And the code resembles the default function key, which you can change through the Manage pane in the Function App blade.

Since your function is accessible, you can call it using, for instance, Postman.

The screenshot above shows an example of a call to the function endpoint. The request includes the function key (code). However, a call like the above might not be as secure as you need. Hence, you can secure the function endpoint by using the API Management service in Azure; see the Using API Management to protect Azure Functions (Middleware Friday) blog post. The post explains how to do that, and it’s more secure!

Integrate and Monitor

You can bind Azure Storage as an extra output channel for a function. Through the Integrate pane you can add an extra output to the function. Configure the new output by choosing Azure Blob Storage, set the Storage account connection, and specify the path.

Next you have to update the function signature with an outputBlob parameter and implement the outputBlob, as sketched below.
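With a string-typed blob output named outputBlob, the updated signature can look like this sketch (note the out parameter, which rules out an async signature):

using System.Net;

// HTTP trigger with an extra Azure Blob Storage output binding (run.csx sketch).
public static HttpResponseMessage Run(HttpRequestMessage req, out string outputBlob, TraceWriter log)
{
    // Whatever is assigned to outputBlob gets written to the blob path
    // configured on the Integrate pane.
    outputBlob = $"Function executed at {DateTime.UtcNow:O}";

    return req.CreateResponse(HttpStatusCode.OK);
}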

Finally, you can monitor your functions through the Monitor pane, which provides some basic insights (logs). For a richer monitoring experience, including live metrics and custom queries, Microsoft recommends using Azure Application Insights. See also Monitoring Azure Functions.

Visual Studio Experience

Azure Functions can be built with Visual Studio. The templates are not available after a default installation of Visual Studio; you need to download them. For Visual Studio 2017, the templates for Azure Functions are available on the marketplace. For Visual Studio 2015, read this blog post, which includes the steps I did for my Visual Studio 2015 installation.

Once the templates are available in your Visual Studio version (2015 or 2017), you can create a Function App project. Within the created Function App project you can add functions. Right-click the project and select Add –> New Azure Function. Now you can choose what type of function to build. You will have a similar experience as with the portal.

For instance, you can create a ServiceBusTrigger function (WindSpeedToBeaufort), which will be triggered once a message arrives on a queue (myqueue).

As a result, you will see the default template code once you hit Create.

Now let’s work on the function so that it converts a wind speed arriving on the queue into a value on the Beaufort scale.

To modify the function to do that, the necessary code is shown below:
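A sketch of what the modified function can look like, assuming the queue message is a wind speed in m/s and using the standard Beaufort approximation v = 0.836 * B^(3/2), i.e. B = (v / 0.836)^(2/3):

using System;

// Service Bus queue-triggered function (run.csx sketch): converts a wind speed (m/s) to Beaufort.
public static void Run(string myQueueItem, TraceWriter log)
{
    double windSpeed = double.Parse(myQueueItem);

    // B = (v / 0.836)^(2/3), capped at 12 (hurricane force).
    int beaufort = Math.Min(12, (int)Math.Round(Math.Pow(windSpeed / 0.836, 2.0 / 3.0)));

    log.Info($"Wind speed {windSpeed} m/s is {beaufort} Beaufort");
}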

And the appsettings.json file needs to be renamed to local.settings.json, and the function.json needs to be modified:
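A sketch of the modified function.json, binding the function to the myqueue queue through a connection setting named connection:

{
  "disabled": false,
  "bindings": [
    {
      "name": "myQueueItem",
      "type": "serviceBusTrigger",
      "direction": "in",
      "queueName": "myqueue",
      "connection": "connection"
    }
  ]
}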

The connection string is moved to the local.settings.json as depicted below:
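Along these lines (both connection strings are placeholders):

{
  "IsEncrypted": false,
  "Values": {
    "AzureWebJobsStorage": "<storage connection string>",
    "AzureWebJobsDashboard": "<storage connection string>",
    "connection": "<Service Bus connection string>"
  }
}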

Most of all, this change is important; otherwise, you will run into errors.

Debugging with Visual Studio

Visual Studio provides the capability to debug your custom function. Compile and start a debug instance. A command line dialog box will appear and your function is running (i.e. hosted and running).

To debug our function in this blog a message is sent to myqueue using the ServiceBus360 service.

Once the message arrives on the queue, it will trigger the function. Hence, debugging can start at the position in the code where a breakpoint has been set.

And the result of execution will be visible in the command line dialog box:

In conclusion, this is the debugger experience you will have with Visual Studio, combined with having IntelliSense while developing your function.

Deployment

You have built and tested your function to your satisfaction in Visual Studio. Now it’s time to deploy it to Azure; therefore, you right-click the project and choose Publish. A dialog will appear and you can choose App Service. Subsequently, if you are logged in with your Azure credentials, you will see, based on the subscription, one or more resource groups.

You can click OK and proceed with the next steps to publish your function to the desired resource group –> function app. However, this will in the end not work!

As a result, you will need a workaround, as explained in Publishing a .NET class library as a Function App; at least, that’s what I found online. I was able to deploy it that way. However, I stumbled on another error in the portal:

Error:

Function ($WindSpeedToBeaufort) Error: Microsoft.Azure.WebJobs.Host: Error indexing method ‘Functions.WindSpeedToBeaufort’. Microsoft.Azure.WebJobs.ServiceBus: Microsoft Azure WebJobs SDK ServiceBus connection string ‘AzureWebJobsconnection‘ is missing or empty.

Hence, not a truly positive experience. In the end, it’s missing a setting, i.e. an application setting of the Function App.

Anyway, another workaround is to add a new function to the existing Function App: choose the ServiceBusTrigger template, create it, and finally copy the code from the local project into the template, over the existing code. In conclusion, this works, as now you see a setting for the Service Bus connection string in the application settings and the reference in the function.json file.

Considerations

There are some considerations around Azure Functions you need to be aware of. First of all, the cost of execution, which determines whether you will choose a consumption or an App Service plan. See Functions pricing and use the calculator to get a better indication of costs. Also consider some of the best practices around functions. These practices are:

  • Azure Functions should do just one task,
  • finish as quickly as possible,
  • be stateless
  • and be idempotent.

See also Optimize the performance and reliability of Azure Functions.

Finally, be aware of the fact that some features of Azure Functions are still in preview, like Proxies, Slots, and the Visual Studio tools.

Resources

This blog contains several links to resources you might like to explore. An excellent starting point for researching Azure Functions is https://github.com/Azure/Azure-Functions. And if you are interested in how functions can play a role in Logic Apps, have a look at this blog post: Building sentiment analysis solution with Logic Apps.

Explore Azure Functions, go serverless!

Cheers,

Steef-Jan

Author: Steef-Jan Wiggers


TUGAIT 2017: Integration and Logic Apps

Two more weeks and I will, once again, return to Lisbon for the TUGAIT event. Last year, during the 2016 edition, Azure MVPs Sandro, Nino and myself (the three integration animals, see the picture below) did a workshop and several sessions on integration, i.e. BizTalk, open source connectivity (GrabCaster), and hybrid. And this was no doubt a successful event and the debut of the integration track. Hence, this year the track will be active again!

TUGAIT 2017

From May 18th until the 20th, a variety of speakers will present on a myriad of technologies like Xamarin, Angular, Data Science, Agile, Scrum, DevOps, Integration, .NET, SQL Server, SharePoint, Office 365, Azure and IoT.

The integration track on Saturday the 20th will be packed with sessions by Azure MVPs Sandro, Nino, Eldert, Riccardo, Tomasso and myself.

Integration in 2017: Logic Apps

Microsoft has made a leap forward with several of their Azure services, including Logic Apps, a service that went GA at the end of July 2016 and evolved rapidly to maturity. Already we see a steadily growing adoption of this service within enterprises. Logic Apps is not the replacement of BizTalk; it is Microsoft’s answer to solving integration challenges in the cloud. The Logic App connectors provide connectivity to other Azure services and several SaaS solutions like MailChimp, Salesforce and CRM Online. At TUGAIT 2017, in the integration track, you will learn more about Logic Apps.

Why attend?

“Nice, there’s an integration track, but what if I would like to learn about other technologies (too)?”

Well, you are at the right place, as on the 18th there’s a full day of workshops you can choose from. On Friday there are 5 parallel tracks from which you can pick and choose. The same applies to Saturday, including the integration track!

The event is located in one of the most beautiful, cultural cities of Portugal. It’s three days packed with content, stellar speakers and community leaders you can listen to and grab to ask questions.

Registration

Registration costs only a few euros, or is even free if you don’t require lunch (the fee is there to reduce waste and prevent having an abundance of food).

You can register here and I will see you there in Lisbon!

Cheers,

Steef-Jan

Author: Steef-Jan Wiggers

