Customer story: BizTalk management through PowerApps


It’s been a month since Feature Pack 1 was released for the BizTalk Server 2016 Enterprise and Developer editions. The Feature Pack introduced a set of new features that help customers leverage new technologies and take advantage of tools they already use in their organization, in this case to enable more streamlined management of their BizTalk installation.

One of these customers was FortisAlberta, an energy company located in Alberta, Canada. With the new management REST APIs and their full Swagger support, they were able to move parts of the operational management of their environments out of the BizTalk servers and Administration Console, and build a PowerApp to create and maintain their applications, making 24/7 support of their environment easier for the operational teams.

Having the option to add, update, and even start existing artifacts directly from the PowerApp has helped FortisAlberta improve its productivity and the speed at which live incidents are resolved.

“FortisAlberta has been using BizTalk since 2006 and is currently migrating to BizTalk 2016 due to its versatility, adaptability and ability to integrate disparate systems with ease.”

Anthony See, FortisAlberta


SendPort is not showing in BizTalk360


BizTalk360 v8.4 has now been released to the public with lots of exciting new features and enhancements. Many of our customers have upgraded to the latest version and started enjoying the new features. We at BizTalk360 support get a lot of queries in the form of tickets: some customers ask about the installation path, a few ask for clarifications, and others report issues. We categorize the tickets as clarifications, feature requests, and bugs.

Our support team often gets strange issues that do not belong to any of these categories. In this post, I will explain one such interesting case and how we identified the root cause and resolved it. As the quote below says,

“The job isn’t to just fix the problem. It is also to restore the customer’s confidence. DO BOTH!” – Shep Hyken

we, the BizTalk360 support team, always work hard to resolve the customers’ issues and achieve customer satisfaction.

The original case stated by the customer

A customer raised a ticket stating that a “Send port is not showing on the BizTalk360 Application portal”. In the BizTalk360 console, the artifacts are listed when we navigate to Operations -> Application Support -> Applications. In this case, one send port was not listed there, although all the send ports appeared when assigning an alarm for monitoring. The other artifacts were all listed properly.

Sendport information missing in BizTalk360

Backtracking and Analyzing the case with the Network Response

Generally, when there is a UI-related issue, we ask for the JSON response from the Network tab of the browser’s developer console. Pressing F12 opens the console, where we can check for exceptions in the service calls. The Network tab lists the service call for each operation, from which we can get the request headers and JSON response and work with the exception details. So we replied to the ticket asking for the network response, but it did not give us the required information. The next step was a web meeting with the customer, involving one of our technical team members, with a screen sharing session.

In the web meeting, we tried different scenarios to check the send ports. In the Search Artifacts section, the send port was listed without any problem, but we noticed something odd: there were multiple entries for the same send port, each with a different URI configured. This was the first time we had come across such a situation. Could this be the reason the send port was not listed? Let’s see what was happening.

Discrepancy in the Send Port information

We exported the send port data from the customer and checked it. There were multiple entries for the same send port, but with different transport types and protocols configured. However, BizTalk Server does not allow us to create send ports with duplicate names, so how could this happen at the customer’s end? Investigating further, we found that the multiple entries were due to the backup transport configured for the send ports. But this was not the cause of the issue either, so the analysis continued.

Discrepancy in send port information pattern

Was the DB2 Adapter causing the real problem?

On further analysis of this case, we found that the DB2 adapter was being used in one of the send ports, and it is not a standard BizTalk adapter. The BizTalk Adapter for DB2 is a send and receive adapter that enables BizTalk orchestrations to interact with host systems. Specifically, the adapter enables send and receive operations over TCP/IP and APPC connections to DB2 databases running on mainframe, AS/400, and UDB platforms. Based on Host Integration Server (HIS) technology, the adapter uses the Data Access Library to configure DB2 connections, and the Managed Provider for DB2 to issue SQL commands and stored procedures.

The trace logs also indicated a NULL assignment for the transport type of the send port using this adapter. For a BizTalk360 standalone installation, it is a prerequisite that the BizTalk administration components are installed on the BizTalk360 server. Since the DB2 adapter comes with HIS, we suggested the customer install HIS on the BizTalk360 server and observe the send port listing. But even after installing HIS, the same issue persisted. We also tried to replicate the scenario by installing HIS with the DB2 adapter on a BizTalk360 standalone server, creating and testing send ports with different combinations of adapters as transport types. The issue was not reproducible, so we concluded that the DB2 adapter was not the real cause of the problem.

The Console App did the trick

Sometimes an issue may seem simple, but identifying its root cause is difficult, and for strange issues that are not reproducible it can be extremely difficult. We also did not want to disturb the customer too often, since they might be busy. Our next plan was to provide a console app to get the complete details of the configured send ports. This app proved quite helpful in finding the root cause. Read on to learn the real cause.

We gave the console app to the customer to get the complete details of the send ports. The app returns the result in JSON format with details such as the name, configured URI, transport type, send handlers, etc. The BizTalk application that contains the send port, along with the database details, must be entered in the app to fetch the response.

JSON response with the sendport details for the console app

From the screenshot, we can see that the sendHandler for the secondaryTransport does not contain the value of the transport type. This was why the send port was not displayed: it was causing the exception.
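Conceptually, the console app's response looked something like the fragment below. The exact field names are illustrative assumptions (the real output is in the customer's screenshot), but the key detail is the secondary transport's send handler missing its transport type:

```json
{
  "name": "SendPortWithBackupTransport",
  "primaryTransport": {
    "transportType": "FILE",
    "sendHandler": { "transportType": "FILE", "host": "BizTalkServerApplication" }
  },
  "secondaryTransport": {
    "sendHandler": { "transportType": null, "host": null }
  }
}
```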

Finally, the Backup Transport configuration was the cause

We probed further into why the sendHandler details were not coming up. The backup transport was configured as “None” in the BizTalk Administration Console for that send port. Even though it was already set to None, we asked the customer to set it to None again and save it. This time the issue was resolved and the send port was listed in the BizTalk360 UI. It appears that when the send port configuration was imported, the backup transport type was set to something other than “None” (the type can be empty or NULL).

If the transport type is anything other than None, the code generates the send handler and looks for the transport type. Since it could not find the transport type, it threw an error. The same issue happened in the production environment as well, and was resolved in the same way.
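The listing logic described above can be sketched as follows. This is only an illustrative C# fragment, not BizTalk360's actual source; the type and member names are assumptions made for the example:

```csharp
using System;

// Hypothetical model of a send port's backup transport entry.
public class BackupTransport
{
    public string TransportType { get; set; } // should hold a real adapter name
}

public static class SendPortListing
{
    // Mimics the behaviour described above: a send handler is only
    // generated when a backup transport entry exists.
    public static string GetBackupSendHandler(BackupTransport backup)
    {
        // Backup transport set to "None": no entry, nothing to resolve.
        if (backup == null)
            return null;

        // The problem case: an imported port can carry a backup transport
        // entry whose transport type is NULL or empty, which makes the
        // send handler lookup fail and the port disappear from the list.
        if (string.IsNullOrEmpty(backup.TransportType))
            throw new InvalidOperationException(
                "Transport type not found for backup transport");

        return "SendHandler_" + backup.TransportType;
    }
}
```

Re-saving the port with the backup transport explicitly set to “None” removes the malformed entry, which is why the workaround fixed the listing.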

To configure BackupTransport for send port in BizTalk server

Conclusion

When importing a send port configuration, we must make sure that the backup transport type is properly set to None, not NULL or empty. This ensures that all the send ports are listed in the BizTalk360 UI without any problem. We were able to identify this with the help of the console app.

Author: Praveena Jayanarayanan

I am working as Senior Support Engineer at BizTalk360. I always believe in team work leading to success because “We all cannot do everything or solve every issue. ‘It’s impossible’. However, if we each simply do our part, make our own contribution, regardless of how small we may think it is…. together it adds up and great things get accomplished.”

Microsoft Integration Weekly Update: May 22


Do you find it difficult to keep up to date with all the frequent updates and announcements in the Microsoft Integration platform?

The Integration weekly update can be your solution. It’s a weekly update on topics related to Integration: enterprise integration, robust and scalable messaging capabilities, and Citizen Integration capabilities empowered by the Microsoft platform to deliver value to the business.

If you want to receive these updates weekly, then don’t forget to Subscribe!

On-Premise Integration:

Cloud and Hybrid Integration:

Feedback

I hope this is helpful. Please feel free to share your feedback on the Integration weekly series.


Using IoT Hub for Cloud to Device Messaging


In the previous blog posts of this IoT Hub series, we have seen how we can use IoT Hub to administrate our devices, and how to do device to cloud messaging. In this post we will see how we can do cloud to device messaging, something which is much harder when not using Azure IoT Hub. IoT devices will normally be low power, low performance devices, like small footprint devices and purpose-specific devices. This means they are not meant to (and most often won’t be able to) run antivirus applications, firewalls, and other types of protection software. We want to minimize the attack surface they expose, meaning we can’t expose any open ports or other means of remoting into them. IoT Hub uses Service Bus technologies to make sure there is no inbound traffic needed toward the device, but instead uses per-device topics, allowing us to send commands and messages to our devices without the need to make them vulnerable to attacks.


Send Message To Device

When we want to send one-way notifications or commands to our devices, we can use cloud to device messages. To do this, we will expand on the EngineManagement application we created in our earlier posts, by adding the following controls, which, in our scenario, will allow us to start the fans of the selected engine.

IoT Hub For Cloud To Device Messaging

To be able to communicate with our devices, we will first implement a ServiceClient in our class.

private readonly ServiceClient serviceClient = ServiceClient.CreateFromConnectionString("HostName=youriothubname.azure-devices.net;SharedAccessKeyName=iothubowner;SharedAccessKey=yoursharedaccesskey"); 

Next we implement the event handler for the Start Fans button. This type of communication targets a specific device by using the DeviceID from the device twin.

private async void ButtonStartFans_Click(object sender, EventArgs e)
{
    var message = new Microsoft.Azure.Devices.Message();
    message.Properties.Add(new KeyValuePair<string, string>("StartFans", "true"));
    message.Ack = DeliveryAcknowledgement.Full; // Used for getting delivery feedback
    await serviceClient.SendAsync(comboBoxSerialNumber.Text, message);
}

Process Message On Device

Once we have sent our message, we will need to process it on our device. For this, we are going to update the client application of our simulated engine (which we also created in the previous blog posts) by adding the following method.

private static async void ReceiveMessageFromCloud(object sender, DoWorkEventArgs e)
{
    // Continuously wait for messages
    while (true)
    {
        var message = await client.ReceiveAsync();

        // Check if message was received
        if (message == null)
        {
            continue;
        }

        try
        {
            if (message.Properties.ContainsKey("StartFans") && message.Properties["StartFans"] == "true")
            {
                // This would start the fans
                Console.WriteLine("Fans started!");

            }

            await client.CompleteAsync(message);
        }
        catch (Exception)
        {
            // Send to deadletter
            await client.RejectAsync(message);
        }
    }
}

We will run this method in the background, so update the Main method, and insert the following code after the call for updating the firmware.

// Wait for messages in background
var backgroundWorker = new BackgroundWorker();
backgroundWorker.DoWork += ReceiveMessageFromCloud;
backgroundWorker.RunWorkerAsync();

Message Feedback

Although cloud to device messages are a one-way communication style, we can request feedback on the delivery of the message, allowing us to invoke retries or start compensation when the message fails to be delivered. To do this, implement the following method in our EngineManagement backend application.

private async void ReceiveFeedback(object sender, DoWorkEventArgs e)
{
    var feedbackReceiver = serviceClient.GetFeedbackReceiver();
    
    while (true)
    {
        var feedbackBatch = await feedbackReceiver.ReceiveAsync();

        // Check if feedback messages were received
        if (feedbackBatch == null)
        {
            continue;
        }

        // Loop through feedback messages
        foreach(var feedback in feedbackBatch.Records)
        {
            if(feedback.StatusCode != FeedbackStatusCode.Success)
            {
                // Handle compensation here
            }
        }

        await feedbackReceiver.CompleteAsync(feedbackBatch);
    }
}

And add the following code to the constructor.

var backgroundWorker = new BackgroundWorker();
backgroundWorker.DoWork += ReceiveFeedback;
backgroundWorker.RunWorkerAsync();

Call Remote Method

Another feature when sending messages from the cloud to our devices is to call a remote method on the device, which we call invoking a direct method. This type of communication is used when we want to have an immediate confirmation of the outcome of the command (unlike setting the desired state and communicating back reported properties, which has been explained in the previous two blog posts). Let’s update the EngineManagement application by adding the following controls, which would allow us to send an alarm message to the engine, sounding the alarm and displaying a message.

IoT Hub For Cloud To Device Messaging

Now add the following event handler for clicking the Send Alarm button.

private async void ButtonSendAlarm_Click(object sender, EventArgs e)
{
    var methodInvocation = new CloudToDeviceMethod("SoundAlarm") { ResponseTimeout = TimeSpan.FromSeconds(300) };
    methodInvocation.SetPayloadJson(JsonConvert.SerializeObject(new { message = textBoxMessage.Text }));

    CloudToDeviceMethodResult response = null;

    try
    {
        response = await serviceClient.InvokeDeviceMethodAsync(comboBoxSerialNumber.Text, methodInvocation);
    }
    catch (IotHubException)
    {
        // Do nothing
    }

    if (response != null && JObject.Parse(response.GetPayloadAsJson()).GetValue("acknowledged").Value<bool>())
    {
        MessageBox.Show("Message was acknowledged.", "Information", MessageBoxButtons.OK, MessageBoxIcon.Information);
    }
    else
    {
        MessageBox.Show("Message was not acknowledged!", "Warning", MessageBoxButtons.OK, MessageBoxIcon.Warning);
    }
}

And in our simulated device, implement the SoundAlarm remote method which is being called.

 
private static Task<MethodResponse> SoundAlarm(MethodRequest methodRequest, object userContext)
{
    // On a real engine this would sound the alarm as well as show the message
    Console.ForegroundColor = ConsoleColor.Red;
    Console.WriteLine($"Alarm sounded with message: {JObject.Parse(methodRequest.DataAsJson).GetValue("message").Value<string>()}! Type yes to acknowledge.");
    Console.ForegroundColor = ConsoleColor.White;
    var response = JsonConvert.SerializeObject(new { acknowledged = Console.ReadLine() == "yes" });
    return Task.FromResult(new MethodResponse(Encoding.UTF8.GetBytes(response), 200));
}

And finally, we need to map the SoundAlarm method to the incoming remote method call. To do this, add the following line in the Main method.

client.SetMethodHandlerAsync("SoundAlarm", SoundAlarm, null);

Call Remote Method On Multiple Devices

When invoking direct methods on devices, we can also use jobs to send the command to multiple devices. We can use our custom tags here to broadcast our message to a specific set of devices.
In this case, we will add a filter on the engine type and manufacturer, so we can, for example, send a message to all main engines manufactured by Caterpillar. In our first blog post, we added these properties as tags on the device twin, so we now use these in our filter. Start by adding the following controls to our EngineManagement application.

IoT Hub For Cloud To Device Messaging

Now add a JobClient to the application, which will be used to broadcast and monitor our messages.

 
private readonly JobClient jobClient = JobClient.CreateFromConnectionString("HostName=youriothubname.azure-devices.net;SharedAccessKeyName=iothubowner;SharedAccessKey=yoursharedaccesskey");

To broadcast our message, update the event handler for the Send Alarm button to the following.

 
private async void ButtonSendAlarm_Click(object sender, EventArgs e)
{
    var methodInvocation = new CloudToDeviceMethod("SoundAlarm") { ResponseTimeout = TimeSpan.FromSeconds(300) };

    methodInvocation.SetPayloadJson(JsonConvert.SerializeObject(new { message = textBoxMessage.Text }));

    if (checkBoxBroadcast.Checked)
    {
        try
        {
            var jobResponse = await jobClient.ScheduleDeviceMethodAsync(Guid.NewGuid().ToString(), $"tags.engineType = '{comboBoxEngineTypeFilter.Text}' and tags.manufacturer = '{textBoxManufacturerFilter.Text}'", methodInvocation, DateTime.Now, 10);
            
            await MonitorJob(jobResponse.JobId);
        }
        catch (IotHubException)
        {
            // Do nothing
        }
    }
    else
    {
        CloudToDeviceMethodResult response = null;

        try
        {
            response = await serviceClient.InvokeDeviceMethodAsync(comboBoxSerialNumber.Text, methodInvocation);
        }
        catch (IotHubException)
        {
            // Do nothing
        }

        if (response != null && JObject.Parse(response.GetPayloadAsJson()).GetValue("acknowledged").Value<bool>())
        {
            MessageBox.Show("Message was acknowledged.", "Information", MessageBoxButtons.OK, MessageBoxIcon.Information);
        }
        else
        {
            MessageBox.Show("Message was not acknowledged!", "Warning", MessageBoxButtons.OK, MessageBoxIcon.Warning);
        }
    }
}

And finally, add the MonitorJob method with the following implementation.

 
public async Task MonitorJob(string jobId)
{
    JobResponse result;

    do
    {
        result = await jobClient.GetJobAsync(jobId);
        await Task.Delay(2000);
    }
    while (result.Status != JobStatus.Completed && result.Status != JobStatus.Failed);

    // Check if all devices successful
    if (result.DeviceJobStatistics.FailedCount > 0)
    {
        MessageBox.Show("Not all engines reported success!", "Warning", MessageBoxButtons.OK, MessageBoxIcon.Warning);
    }
    else
    {
        MessageBox.Show("All engines reported success.", "Information", MessageBoxButtons.OK, MessageBoxIcon.Information);
    }
}

Conclusion

By using IoT Hub we have a safe and secure way of communicating from the cloud and our backend to devices out in the field. We have seen how we can use the cloud to device messages in case we want to send one-way messages to our device or use direct methods when we want to be informed of the outcome from our invocation. By using jobs, we can also call out to multiple devices at once, limiting the devices being called by using (custom) properties of the device twin. The code for this post can be found here.

IoT Hub Blog Series

In case you missed the other articles from this IoT Hub series, take a look here.

Blog 1: Device Administration Using Azure IoT Hub
Blog 2: Implementing Device To Cloud Messaging Using IoT Hub
Blog 3: Using IoT Hub for Cloud to Device Messaging

Author: Eldert Grootenboer

Eldert is a Microsoft Integration Architect and Azure MVP from the Netherlands, currently working at Motion10, mainly focused on IoT, BizTalk Server, and Azure integration. He comes from a .NET background, and has been in IT since 2006. He has been working with BizTalk since 2010 and since then has expanded into Azure and surrounding technologies as well. Eldert loves working on integration projects, as each project brings new challenges and there is always something new to learn. In his spare time Eldert likes to be active in the integration community and get his hands dirty on new technologies. He can be found on Twitter at @egrootenboer and has a blog at http://blog.eldert.net/.

Microsoft Integrate 2017 Event


Don’t miss the greatest integration event in the world.

Name : Integrate 2017

When : 26-28 June 2017

Where : Kings Place, London

Discount Code : MVPSPEAK2017REF

Link : https://www.biztalk360.com/integrate-2017/

Topics : Microsoft BizTalk, Host Integration Server, Logic Apps, IBM Legacy Integration, and much more

Speakers : Product Group, MVP (Most Valuable Professional)

Audience : From all over the world


My sessions at TUGA IT 2017 | May 18th–20th | Lisbon ,Portugal


We are 3 days away from TUGA IT 2017 returning to Lisbon! And you are still in time to register for this event here. This year, in addition to organizing the integration track, I will also present 2 sessions: one focused on PowerApps and Microsoft Flow, on the SharePoint/Office 365 track, and the second focused on an Azure Enterprise Integration feature: Logic Apps.

You can check the full event agenda here.

Here are the titles and abstracts of the sessions I will deliver at TUGA IT 2017.

How PowerApps and Microsoft Flow allow your power users to quickly build Enterprise Mobile Apps

“How PowerApps and Microsoft Flow allow your power users to quickly build Enterprise Mobile Apps” will take place on May 19th, on the SharePoint/Office 365 track, between 9 AM and 10 AM.

Every organization faces constant pressure to do more with less. While technology is often the key to operating more effectively and efficiently, cost and complexity have often prevented organizations from taking maximum advantage of the potential benefits. The growth of SaaS (software as a service) has lowered barriers – no need to deploy servers or to install and configure complex software systems. Just sign up and go.

Microsoft Flow and Microsoft PowerApps will help these people (normally business users) achieve more.

We know not every business problem can be solved with off-the-shelf solutions. But developing custom solutions has traditionally been too costly and time-consuming for many of the needs teams and departments face, especially those projects that integrate across multiple data sources or require delivery across multiple devices from desktop to mobile. As a result, too many technology needs end up unsolved or under-optimized. We piece together spreadsheets, email chains, SharePoint, and/or manual processes to fill in the gaps.

TUGA IT 2017: PowerApps

PowerApps and Microsoft Flow are both aimed squarely at these gaps. They give people who best understand their needs and challenges the power to quickly meet them, without the time, complexity and cost of custom software development.

In this session, we will look at these two new offerings from Microsoft: PowerApps and Flow. What are they? How can you use them? Most importantly, we will walk through some live demos created from scratch, showing how to build an enterprise mobile application that easily connects with your enterprise platforms, like Office 365, SharePoint Online, Dynamics CRM, on-premises SQL, social networks and much more, and how to automate common tasks using the new Microsoft Flow.

The Speaker Nightmare: Eval Forms, OCR, Logic Apps & Power BI

“The Speaker Nightmare: Eval Forms, OCR, Logic Apps & Power BI” will take place on May 20th, on the Enterprise Integration track, between 10:20 AM and 11:30 AM.

An evaluation form is something a speaker both loves and hates, especially if the results are processed in real time and publicly available. If the result is excellent, it is extremely rewarding; other times, it may “hurt” the speaker who made himself available to share his knowledge and was evaluated negatively. Sometimes attendees are unfair in their evaluations, for example marking a 100-level session as too basic (these sessions are supposed to be basic or introductory), or sometimes the speaker simply had a bad day (it happens to everyone).

TUGA IT 2017: SmartDocumentor

(this demo will be in real time)

I speak from personal experience: in the last 6 years that I have been speaking at community events, in Portugal and abroad, I have been evaluated in all ways: badly, reasonably, good and excellent. Sometimes, in the same session, attendees with different profiles have evaluated me both badly and excellently. The key points for the speaker are:

  • All feedback is good, whether negative or positive: the speaker can learn from it to improve, or learn that a specific topic is not a good fit for a certain audience.
  • He only needs to give his best! We cannot please everyone, and the goal is to feel happy with yourself and your performance.

I love evaluation forms, and I love for them to be publicly available, even better during the event itself. At the very least, they give people who do not know each other a good topic of conversation and keep the conversation flowing naturally. People are usually afraid or shy to start a conversation with unfamiliar persons, and this is a good icebreaker.

In this session, I will show and explain a real live demo of how we can easily build a robust solution for processing evaluation forms, using OCR software, and integrate it with Power BI to present the results in an interactive and beautiful way. Most importantly: how you can educate your enterprise developers and IT pros to extend capabilities for power users, who understand their business challenges best, and allow them to use familiar tools, like OCR software (SmartDocumentor) to process evaluation forms, and to quickly build and deliver Power BI solutions with interactive data dashboards. At the same time, you will see how to integrate these tools, platforms and systems in a very quick and robust way using the integration features on Azure, like Logic Apps, API Apps and Azure Functions, and how to start from a simple solution and evolve it by enabling new functionality.

Registration For TUGA IT 2017

TUGA IT 2017 will take place in Microsoft Portugal’s offices, in Lisbon, on May 18-20, 2017. It will feature 3 days of breakout sessions and full-day workshops by world-class speakers where IT Professionals can spend 3 amazing days checking the future of IT and also take the time to network with top-level speakers and other IT Professionals.

Registration for TUGA IT 2017 costs a few euros, or is even free if you do not require lunch (the fee is there to reduce waste and prevent having an abundance of food).

You can register here and I will see you there in Lisbon!

Author: Sandro Pereira

Sandro Pereira lives in Portugal and works as a consultant at DevScope. In the past years, he has been working on implementing Integration scenarios both on-premises and cloud for various clients, each with different scenarios from a technical point of view, size, and criticality, using Microsoft Azure, Microsoft BizTalk Server and different technologies like AS2, EDI, RosettaNet, SAP, TIBCO etc. He is a regular blogger, international speaker, and technical reviewer of several BizTalk books all focused on Integration. He is also the author of the book “BizTalk Mapping Patterns & Best Practices”. He has been awarded MVP since 2011 for his contributions to the integration community.

My next event and sessions at TUGAIT (Lisbon)


TUGAIT is one of the most important community events in Europe; many speakers from across all countries join the conference, and most of them are Microsoft MVPs.
Community events are the highest expression of passion for technology, because they are organized purely for the sake of sharing knowledge and experience.
This year I will present 2 sessions (here is the full agenda): one focused on Microsoft Azure, the second focused more on the development aspect.

In the first session, the Microsoft Azure Event Hubs workout session, I will present how I see Event Hubs, all the technologies normally involved with it, and how to use it in real scenarios.
For this session I have prepared a very interesting demo involving Event Hubs, Stream Analytics, Service Bus, Data Factory, Data Catalog, Power BI, Logic Apps and on-premises integration.
I will explain how to manage Event Hubs, the most important challenges in scenarios that use it, and how to solve them.
We will compare Event Hubs with the other technologies and see how to use it in different scenarios like IoT, event-driven integration, broadcast messaging and more.

In the second session, How to face the integration challenges using open source, I will present how to integrate technologies today using all the options available, the differences between them and the best approaches to use.
I will present the most interesting challenges in hybrid integration and messaging scenarios using cloud and on-premises options.
We will examine all the available technology stacks and how to combine them to solve integration scenarios like hybrid data ingestion, big data movement, high load, reliable messaging and more.
We will see how to work with specific technology stacks and how to use them in different ways to save cost and obtain first-class solutions.

I’m very excited, because TUGAIT is a very important event and it’s a great opportunity to share your personal experience with a lot of people and have a lot of fun in a perfect community spirit.

TUGAIT is free, and you can register here.

See you there then!

Author: Nino Crudele

Nino has deep knowledge and experience delivering world-class integration solutions using all the Microsoft Azure stacks and Microsoft BizTalk Server, integrating many different technologies such as AS2, EDI, RosettaNet, HL7, RFID, and SWIFT.

Azure announcements from Microsoft Build 2017



Microsoft Build (often stylized as //build/) is an annual conference held by Microsoft, aimed at software engineers and web developers using Windows, Windows Phone, Microsoft Azure and other Microsoft technologies. This year’s conference was held in Seattle, WA, from May 10 to May 12.

Unlike the previous year, which concentrated mainly on Visual Studio and .NET Core, Microsoft this time concentrated on AI and Azure services. This blog provides a compilation of all the Azure announcements from the 3-day event.

Azure IoT Edge


IoT Edge provides easy orchestration between code and services, so they flow securely between cloud and edge to distribute intelligence across IoT devices. It leverages Azure Stream Analytics, Microsoft Cognitive Services, and Azure Machine Learning to create more advanced IoT solutions with less time and effort.

To get more info on Azure IoT Edge please click here or check out the video on channel9.

Azure Batch AI Training

On Wednesday Microsoft announced a new service called Azure Batch AI training. It uses Azure to train deep neural networks, which means that now it is possible for developers to train their AI without having to worry about hardware.

To get more info on Azure Batch AI Training please click here.

Azure Cloud Shell


Microsoft has invested heavily in the command line interface. Now you can use Azure Cloud Shell from inside the Azure portal! Yes, you heard me right: the Azure portal now has a real Bash command line interface. It is also preloaded with the Azure CLI, so you can directly use commands like azure vm list right from the portal.

Azure Cloud Shell is still in preview; please click here to find more information on Azure Cloud Shell.

MySQL and PostgreSQL on Azure


With this support, developers will now be able to use their favourite databases on Azure. They will surely appreciate this, since more options mean more flexibility.

The Azure Database for MySQL and Azure Database for PostgreSQL services are built on the intelligent, trusted and flexible Azure relational database platform. This platform extends managed service benefits, such as global Azure region reach, and the innovations that currently power the Azure SQL Database and Azure SQL Data Warehouse services to the MySQL and PostgreSQL database engines.

Cosmos DB


Azure Cosmos DB is Microsoft’s first globally distributed, multi-model database. It enables you to elastically and independently scale throughput and storage across any number of Azure’s geographic regions.

To find more information on Azure Cosmos DB please click here.

Cognitive Services

Microsoft also announced new cognitive services on top of the 25 existing ones. These new services include a machine vision service, a Bing-based search engine powered by AI, a video indexer, and a new online lab where more experimental services may be unveiled.

Check out this page to find all the available cognitive services in Azure Platform.

Conclusion

And that’s all of it! Since Microsoft Build is a developer conference, most of the feature announcements targeted developers, but they will probably influence the future of AI, since Microsoft is making it easy for developers to include the power of AI with minimal effort.

Author: Umamaheswaran Manivannan

Umamaheswaran is a Senior Software Engineer at BizTalk360 with 6 years of experience. He is a full stack developer who has worked with various technologies, like .NET, AngularJS, etc.

Microsoft Integration Weekly Update: May 15


Do you find it difficult to keep up to date with all the frequent updates and announcements in the Microsoft Integration platform?

The Integration weekly update can be your solution. It’s a weekly update on topics related to Integration: enterprise integration, robust and scalable messaging capabilities, and Citizen Integration capabilities empowered by the Microsoft platform to deliver value to the business.

If you want to receive these updates weekly, then don’t forget to Subscribe!

On-Premise Integration:

Cloud and Hybrid Integration:
