by Saravana Kumar | Feb 15, 2017 | BizTalk Community Blogs via Syndication
We are very happy to announce the immediate availability of BizTalk360 version 8.3. This is our 48th product release since May 2011. In the version 8.3 release, we focused on bringing the following 6 features –
- Azure Logic Apps Management
- Webhooks Notification Channel
- ESB Portal Dashboard
- EDI Reporting Manager
- EDI Reporting Dashboard
- EDI Functional Acknowledgment Viewer
We have put together a release page showing more details about each feature, with links to blog articles and screenshots. You can take a look here: https://www.biztalk360.com/biztalk360-version-8-3-features/
Public Webinars
We are conducting two public webinars on 28th Feb to go through all the features in version 8.3. You can register via the links below.
BizTalk360 version 8.3 Public Webinar (28th Feb – Europe)
https://attendee.gotowebinar.com/register/2736095484018319363
BizTalk360 version 8.3 Public Webinar (28th Feb – US)
https://attendee.gotowebinar.com/register/1090766493005002755
You can read a quick summary of each feature below.
Azure Logic Apps Operations
You can now manage your Azure Logic Apps right inside the BizTalk360 user interface. This eliminates the need to log in to the Azure Portal to perform basic operations on your Logic Apps. Users can perform operations such as Enable, Disable, Run Trigger and Delete on a Logic App. In addition to performing the basic operations, users can view the trigger history and run history information in a nice graphical and grid view. We are making further investments in Azure Logic Apps management and monitoring, to give customers a single management console for both BizTalk and the cloud. Read More.
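Under the covers, operations like Enable, Disable, and Run Trigger map onto Azure Resource Manager REST calls against the Logic App resource. A minimal sketch of how those request URLs are shaped (the api-version shown is an assumption; check the current Azure REST API reference):

```python
# Sketch of the Azure Resource Manager URLs behind Logic App operations
# such as enable and disable. The api-version value is an assumption.
BASE = "https://management.azure.com"
API_VERSION = "2016-06-01"  # assumed; may differ in current Azure APIs

def workflow_action_url(sub: str, group: str, workflow: str, action: str) -> str:
    """Build the POST URL for an action (e.g. enable/disable) on a Logic App."""
    return (f"{BASE}/subscriptions/{sub}/resourceGroups/{group}"
            f"/providers/Microsoft.Logic/workflows/{workflow}"
            f"/{action}?api-version={API_VERSION}")

disable_url = workflow_action_url("my-sub", "my-rg", "order-app", "disable")
```

A management tool would POST to such a URL with a bearer token obtained from Azure Active Directory; the sketch only shows how the resource path is composed.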
Webhook Notification Channel
The new webhook notification channel allows you to send a structured alert notification (in JSON format) to any REST endpoint whenever there is a monitoring violation. This opens up interesting scenarios like calling an Azure Logic App to do further processing. Read More.
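On the receiving side, an endpoint simply parses the POSTed JSON and acts on it. A minimal sketch (the payload field names here are hypothetical, not BizTalk360's actual schema):

```python
import json

# Hypothetical example of an alert payload a webhook endpoint might receive.
# The real field names in BizTalk360's JSON schema may differ.
sample_alert = """{
    "alarmName": "Production-Health",
    "environment": "BizTalk-Prod",
    "severity": "Error",
    "violations": [
        {"artifact": "OrderProcessing", "state": "ReceiveLocationDisabled"}
    ]
}"""

def handle_alert(body: str) -> str:
    """Parse the alert JSON and decide how to route it."""
    alert = json.loads(body)
    # e.g., open a ticket when the severity indicates an error
    if alert["severity"] == "Error":
        return f"ticket opened for {alert['alarmName']}"
    return "logged"

result = handle_alert(sample_alert)
```

Because the payload is plain JSON over HTTP, the same handler logic could just as easily live in an Azure Logic App with a request trigger.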
ESB Dashboard
The new ESB dashboard allows you to aggregate different ESB reports into a single graphical dashboard that helps business users and support people visualize the data in a better way. BizTalk360 ships 13 different widgets that help users understand their ESB integrations and analyze the data to improve performance. The widgets are categorized as –
- Fault code widgets (based on application, service or error type)
- Fault code over time widgets
- Itinerary Widgets
Read More
EDI Reporting Manager
The EDI reporting manager allows you to turn reporting on or off for each agreement in a single click. Performing this task in the BizTalk Admin Console is tedious, especially when there are many parties and agreements. Similarly, administrators can perform bulk enable/disable operations on the NRR configuration. Administrators can also add the Host Partner information required to configure the host party for NRR configurations.
EDI Reporting Dashboard
In version 8.3, out of the box, we are providing a rich EDI dashboard to aggregate different EDI transactions, which helps business users and support people visualize the data in a better way. You can also create your own EDI Reporting dashboard with the three widget categories we ship –
- EDI Interchange Aggregation widgets
- EDI Transaction Set aggregation widgets
- AS2 messaging aggregation reports
Read More
BizTalk360 has a few aggregations (as widgets) that are not available in BizTalk Admin Console reports such as Interchange Count by Agreement Name (Top 10), Interchange count by Partner Name (Top 10), Transaction count by ACK Status (Filtered by partner id), AS2 messaging aggregation reports.
EDI Functional Acknowledgment
In a standard B2B scenario, whenever a message (with multiple functional groups) is sent from source to destination, the source will naturally expect a technical acknowledgment (TA1) and a functional acknowledgment (997) for each functional group. What if the technical acknowledgment is received but not the functional acknowledgment? Checking the status of functional acknowledgments is not an easy job in the BizTalk Admin Console. With BizTalk360, administrators can easily view the status of functional acknowledgments within the grid and also set up data monitoring alerts whenever negative functional acknowledgments are received. Read More
Version 8.3 Launch page
https://www.biztalk360.com/biztalk360-version-8-3-features/
Version 8.3 Release Notes
http://assist.biztalk360.com/support/solutions/articles/1000244514
Author: Saravana Kumar
Saravana Kumar is the Founder and CTO of BizTalk360, enterprise software that acts as an all-in-one solution for better administration, operation, support and monitoring of Microsoft BizTalk Server environments. View all posts by Saravana Kumar
by Nick Hauenstein | Jul 27, 2016 | BizTalk Community Blogs via Syndication
Today the Logic Apps team has officially announced the general availability of Logic Apps! We’ve been following developments in the space since it was first unveiled back in December of 2014. The technology has certainly come a long way since then, and is becoming capable of being a part of enterprise integration solutions in the cloud. A big congratulations is in order for the team that has carried it over the finish line (and that is already hard at work on the next batch of functionality that will be delivered)!
Along with hitting that ever important GA milestone, Logic Apps has recently added some new features that really improve the overall experience in using the product. The rest of this post will run through a few of those things.
Starter Templates
When you go and create a new Logic App today, rather than being given an empty slate and a dream, you are provided with some starter templates with which you can build some simple mash-ups that integrate different SaaS solutions with one another and automate common tasks. If you’d still rather roll up your sleeves and dig right into the code of a custom Logic App, there is nothing preventing you from starting from scratch.
Designer Support for Parallel Actions
Ever since the designer went vertical, it has been very difficult to visualize the flow of actions whenever there were actions that could execute in parallel. No longer! You can now visualize the flow exactly as it will execute – even if there are actions that will be executing in parallel!
Logic Apps Run Monitoring
Another handy improvement to the visualization of your Logic Apps is the new runtime monitoring visualization provided in the portal. Instead of seeing a listing of each action in your flow alongside their statuses – with tens of clicks involved in taking in the full state of the flow at any given time – a brand new visualizer can be used to see everything in one shot.
The visualization captures essentially the same thing that you see in the Logic App designer, but shows both the inputs and the outputs on each card along with a green check mark (Success), red X (Failure), or gray X (skipped) in the top-right corner of the cards.
Additionally if you have a for each loop within your flow, you can actually drill into each iteration of the loop and see the associated inputs/outputs for that row of data.
Visual Studio Designer
There is one feature that you won’t see in the Azure portal. In fact, it’s designed for offline use – the Visual Studio designer for Logic Apps. The designer can be used to edit those Logic App definitions that you’d rather manage in source control as part of an Azure Resource Group project – so that you can take advantage of things like TFS for automated build and deployment of your Logic Apps to multiple environments.
Unfortunately, at the moment you will not experience feature parity with the Azure Portal (i.e., it doesn’t do scopes or loops), but it can handle most needs and sure is snappy!
That being said, do note that at the moment, the Visual Studio designer is still in preview and the functionality is subject to change, and might have a few bugsies still lingering.
Much More
These are just a few of the features that stick out immediately while using the GA version of the product. However, depending on when you last used the product, you will find that there are lots of runtime improvements and expanded capabilities as well (e.g., being able to control the parallelism of the for each loops so that they can be forced to execute sequentially).
Be Prepared
So how can you be prepared to take your integrations to the next level? Well, I’m actually in the middle of teaching all of these things right now in QuickLearn Training’s Cloud-based Integration using Logic Apps class, and in my humble and biased opinion, it is the best source for getting up to speed in the world of building cloud integrations. I highly recommend it. There’s still a few slots left in the September run of the class if you’re interested in keeping up with the cutting edge, but don’t delay too long as we expect to see these classes fill up through the end of the year.
As always, have fun and do great things!
by Rob Callaway | Jun 9, 2016 | BizTalk Community Blogs via Syndication
One of the concerns that I have repeatedly heard from customers when we talk about Azure is application lifecycle management. If you do most of your resource deployment and management using the Azure Portal, then you probably picture a very manual migration process if you wanted to move your app from dev to test, or if you wanted to share your app with another developer.
A clear example of this occurred during a run of QuickLearn’s Cloud-Based Integration Using Azure App Service course, when my students were quick to see that the Logic Apps they created were pretty much stuck where they were created. Moving from one resource group to another was impossible at the time, and exporting a Logic App (and all the API Apps it depended on) was only a dream, so the only option was to redo all your work in order to recreate the Logic App in another resource group or subscription.
Logic Apps and Azure App Service have come a long way since then and the QuickLearn staff has been working its collective noodle to come up with application lifecycle management guidance for Logic Apps using the tools that are available today, which will hopefully improve the way you go about deploying and managing your Logic Apps.
Some readers may already be aware of the Azure Resource Manager or ARM for short. For those who haven’t previously met my little friend I’ll give a short introduction of ARM and the tools that exist around it. ARM is the underlying technology that the Azure Portal uses for all its deployment and management tasks. For example, if you create any resource within a new Resource Group using the Portal it’s really ARM behind the scenes orchestrating the provisioning process.
“Great Rob, but why do I care?”
I’ll tell you why. There are tools designed around ARM that make it not only possible, but down-right easy to run ARM commands. For example, you can get the Azure PowerShell module or the Azure Command Line Interface (CLI) and script your management tasks.
There’s a little more to it, though. You see, those Azure resources (Logic Apps, Resource Groups, Azure App Service plans, etc.) are complex objects. Resource Groups, for example, have dozens of configurable properties and serve as containers for other objects (e.g., Web Sites, API Apps, Logic Apps, etc.). Let’s not oversimplify reality: your cloud applications aren’t made up of a single resource, but instead are many resources that work in tandem. Therefore, any deployment or management strategy needs to bear that in mind. If you want to pull back the covers on your own resources, head over to the Azure Resource Explorer and you’ll see what I’m talking about.
“It’s nice to have a command that I can run in a console window to create a Resource Group, but I need more than that!”
You’re right. You do need more than that. The way you get more is using ARM Templates. ARM Templates provide a declarative way to define deployment of resources. The ARM Template itself is a JSON file that defines the structure and configuration of one or more Azure resources.
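The overall shape of such a template is small: a schema reference, a content version, a parameters section, and the list of resources to deploy. A minimal sketch, built here as a Python dictionary and serialized to JSON (the workflow `properties` are deliberately left empty):

```python
import json

# Minimal ARM template skeleton for deploying a single Logic App.
# The workflow definition itself would go inside "properties".
template = {
    "$schema": ("https://schema.management.azure.com/schemas/"
                "2015-01-01/deploymentTemplate.json#"),
    "contentVersion": "1.0.0.0",
    "parameters": {
        "logicAppName": {"type": "string"}
    },
    "resources": [
        {
            "type": "Microsoft.Logic/workflows",
            "apiVersion": "2016-06-01",  # assumed; check the current version
            "name": "[parameters('logicAppName')]",
            "location": "[resourceGroup().location]",
            "properties": {}
        }
    ]
}

template_json = json.dumps(template, indent=2)
```

The bracketed strings like `[parameters('logicAppName')]` are ARM template expressions evaluated at deployment time, which is what lets one template serve many environments.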
“So how do I get one of these templates?”
There are several ways that you can get your hands on the ARM Template that you want.
- Build it by hand – The template is a JSON file so I guess if you understand the schema of the JSON well enough you could write an ARM Template using Notepad, Kate, or Visual Studio Code. This doesn’t seem very practical to me.
- Use starter templates – The Azure SDK for Visual Studio includes an Azure Resource Group project type which includes empty templates for an array of Azure resources. These templates are actually retrieved from an online source and can be updated at any time to include the latest resources. This looks a lot more viable than using Notepad, but in the end you are still modifying a JSON file to define the resource that you want.
- Export the template – You can export existing resources into a new ARM Template file. The process varies slightly from one type of resource to the next but you essentially go to the resource in the Azure Portal and export the resource to an ARM Template file. Sadly, at the time this article is being written this is not supported for Logic Apps, but Jeff Hollan has a custom PowerShell cmdlet that he built to export a Logic App to an ARM Template file.
One more thing — these templates are designed to utilize parameter files, so any aspect of the resource you’re deploying could be set at deploy-time via a parameter in a parameter file. For example, the pricing tier utilized by your App Service plan might be Free in your development environment and Standard in your test environment. The obvious approach is to create a different parameter file for each environment or configuration you want to use.
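Those per-environment parameter files differ only in their values. A sketch of generating one per environment (the parameter name `servicePlanSku` is illustrative, not taken from a real template):

```python
import json

# One set of parameter values per environment; only the values differ.
# "servicePlanSku" is a hypothetical parameter name for illustration.
environments = {
    "dev":  {"servicePlanSku": "Free"},
    "test": {"servicePlanSku": "Standard"},
}

def parameter_file(values: dict) -> str:
    """Render an ARM parameter file document for one environment."""
    doc = {
        "$schema": ("https://schema.management.azure.com/schemas/"
                    "2015-01-01/deploymentParameters.json#"),
        "contentVersion": "1.0.0.0",
        "parameters": {name: {"value": v} for name, v in values.items()},
    }
    return json.dumps(doc, indent=2)

dev_parameters = parameter_file(environments["dev"])
```

At deploy time you pass the template once and swap in the parameter file that matches the target environment, which keeps the template itself identical across the pipeline.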
“I see what you did there… So now what?”
Well, now you’ve got your template, a way to represent the differences in environments as your application flows through the release pipeline, and an easy and repeatable way to deploy your resources wherever and whenever you want. The only piece that’s missing is the tooling to perform the deployment.
As mentioned above, you could use the Azure PowerShell tools or Azure CLI to create scripts that you manually execute. Those Visual Studio ARM Template projects even include a pre-built PowerShell script that you could execute.
Personally, I love automation but I’ve never been a big fan of asking a person to manually run a random script and feed it some random files. I want something that’s more streamlined. I want something that is simultaneously:
- Automated – The process once triggered should not require manual help or intervention
- Controlled – The process should accommodate appropriate approvals along the way if needed
- Consistent and Repeatable – The process should not vary with each execution; it should have predictable outcomes based on the same inputs
- Transparent – The whole team should have visibility into the deployments that have taken place, and be able to identify which versions of the code live where, and why (i.e., I should have work item-level traceability)
- Versioned – Changes within the process and/or the process inputs (i.e., Logic App code) should be documented and discoverable
- Scalable – It should be just as easy to deploy 20 things as it is to deploy 1 thing.
For the past few years my team has been using TFS / VSTS as our primary source control and project management tool. In that time we’ve become more reliant on the excellent build system (Team Foundation Build) that TFS offers.
Team Build is much more than a traditional local build using Visual Studio. Team Builds run on a build server (i.e., not on your local computer) and are defined using a Build Definition. The Build Definition is a declarative definition of both the process that the build server will execute, as well as the settings regarding how the build is triggered, and how it will execute. It’s essentially a workflow for preparing your application for deployment.
The Build Definition is made up of tasks. Each task performs a specific step required in the build process. For example, the Visual Studio Build task is used to compile .NET projects within Visual Studio Solutions, and within the step you can control the Platform (Win32, x86, x64, etc.) and the Configuration (debug or release). The Xamarin.Android task, by contrast, is used for compiling Android applications with settings appropriate for them.
Build Definitions can have Tasks that do more than compile your code. You might include tasks to run scripts, copy files to the build server, execute tests (Load Tests, Web Performance Tests, Unit Tests, Coded UI tests etc.), or create installation packages (though this would generally just be done through another project in your solution [e.g., with Flexera InstallShield and/or the WiX Toolset]). This gives you the power to quickly and automatically execute the tasks that are appropriate for your application.
Furthermore, a single Team Project in TFS could have multiple build definitions associated with it; because sometimes you want the build to simply compile, but other times you want to burn down the village, compile, run tests, and then deploy your web site to Azure for manual testing. Or perhaps you’re managing builds for multiple feature branches or even multiple applications within the Team Project.
“So what does this have to do with Logic Apps?”
If I add one of those ARM Template Visual Studio projects to my TFS / VSTS source control repository (whether it’s a Git repository or TFVC), I can create a Build Definition that compiles the ARM Deployment Project and other Visual Studio projects that include resources used by my cloud application (e.g., custom API Apps, Web Sites, etc.), and then publishes the ARM Template files (templates and parameter files) to a shared location where they can be accessed by automated deployment processes.
This was surprisingly easy to set up; I think it only took about 5 minutes. The best part is that I can have this build trigger on check-in, so my deployment files are always up-to-date.
Here’s what my Build Definition looks like:
First I compile the project.
Then I copy the ARM Template files and parameter files from the build output directory to a temporary file location.
Finally, I publish the files from the temporary location. I’m using a Server location that other steps in the build (or a Release Manager release task) could use. It could have also been a file share to give access to processes not hosted in TFS.
“So what does all this add up to?”
Whenever someone changes the ARM Deployment project (whether modifying the template or parameters file or adding a new template/parameter file to it) Team Build runs my Build Definition to: (1) compile my project, (2) extract the ARM deployment files from the build directory, and (3) publish the files as an Artifact named templates. That Artifact lives on the build server and can be accessed by VSTS Release Management release tasks that will actually deploy my Azure resources to the cloud.
Release Management (a component of TFS / VSTS) helps you automate the deployment and testing of your software in multiple environments. You can either fully automate the delivery of your software all the way to production, or set up semi-automated processes with approvals and on-demand deployments.
In Release Management, you create Release Definitions that are conceptually similar to build definitions. A Release Definition is a declarative definition of the deployment process. Just like a Build Definition, a Release Definition is composed of tasks and each task provides a deployment step. The primary input for a Release Definition is one or more Artifacts created by your Build(s).
Release Definitions add a couple of extra layers of complexity. One of those layers is the Environment. We all know that release pipelines are made up of multiple environments, and often each environment will come with its own unique requirements and/or configuration details. Within a single release definition you can create as many environments as you want, and then configure the Tasks within a given environment as appropriate for that system. The various Environments in your Release Definition can have similar or different Tasks.
Each environment can also utilize variables if you’d prefer to avoid hard-coding things that are subject to change.
In this simple example, I created a Release Definition with two environments: Development and Test. Within each environment I used the Azure Resource Group Deployment task to deploy my Logic App, Service Plan, and Resource Group as defined in my ARM Deployment Template JSON file.
I configured the deployment to Development to happen automatically upon successful build (remember the build runs when I check-in the source code). But I wanted Test deployments to be manual.
I also created variables that enabled me to parameterize the name of the Resource Group, and the name of the Parameter File to use in each environment.
You can see here how I’m using those variables within the Azure Resource Group Deployment task.
Of course it works.
I go to my Visual Studio project and modify something about my Logic App template; maybe I finally get around to fixing that grammatical error in my response message.
Then I check-in my changes.
In VSTS, I can see that my build automatically started.
After the build completes, in the Release Hub I can see that a new release (Release-4) using the latest build (13) has started deploying to the Development environment.
I’ve got logs to show me what happened during the deployment.
I can see the commits or changesets included in this release compared to earlier releases. So a month from now Nick can see what modifications were deployed in Release-4.
What’s going on in Azure though? It looks like the Logic App in the Development Resource Group was updated to match my changes.
But my Test environment wasn’t touched.
Over on the Release Hub, I can manually start the Deployment to Test.
I almost forgot, deploying to Test requires an approval as well.
Just like that, it’s done.
In about 30 minutes I was able to create a deployment pipeline for my Logic App. The deployment pipeline is flexible enough that changes can be made easily, but structured in a way that I (and everyone else on my team) can see exactly what it does.
QuickLearn Training offers courses to enhance your understanding of TFS / VSTS and Logic Apps. Our Build and Release Management Using TFS 2015 course has all the finer details that you’ll never get out of a blog article, and our Cloud-Based Integration Using Azure App Service course teaches you how to build enterprise-ready integration solutions using features of the Azure cloud.