Integrate 2017 USA Day 2 Recap

And so the second day of Integrate 2017 USA is in the books, another day of great sessions and action-packed demos. We started the day with Mayank Sharma and Divya Swarnkar from Microsoft CSE, formerly Microsoft IT, taking us through their journey to the cloud. Microsoft has an astounding number of integrations running for all their internal processes, communicating with 1000+ partners. With 175+ BizTalk servers running on Azure IaaS, handling 170M+ messages per month, they really need an integration platform they can rely on.

Like most companies, Microsoft is also looking into ways to modernize their application landscape, as well as to reduce costs. To accomplish this, they are now using Logic Apps at the heart of all their integrations, with BizTalk as the bridge to their LOB systems. By leveraging API Management they can test their systems in production as well as in their UAT environments, ensuring that all systems work as expected. And by using the geo-replication options the Azure platform provides, they ensure that even in case of a disaster their business will stay up and running.

Adopting a microservices strategy, each Logic App is set up to perform a specific task, and metadata is used to execute or skip specific parts. To me this seems like a great setup, and definitely something to look into when setting up your own integrations.
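To make the idea concrete, here is a minimal sketch of metadata-driven execution. The `Message` envelope, step names and handlers are all made up for illustration; each handler stands in for what would be a single-purpose Logic App that runs or is skipped based on the message's metadata.

```python
from dataclasses import dataclass, field

@dataclass
class Message:
    body: dict
    metadata: dict = field(default_factory=dict)

def transform(msg: Message) -> Message:
    msg.body["transformed"] = True
    return msg

def enrich(msg: Message) -> Message:
    msg.body["enriched"] = True
    return msg

def archive(msg: Message) -> Message:
    msg.body["archived"] = True
    return msg

# Each entry stands in for a single-purpose Logic App.
STEPS = {"transform": transform, "enrich": enrich, "archive": archive}

def run_pipeline(msg: Message) -> Message:
    for name, step in STEPS.items():
        # The metadata on the message decides whether a step executes or is skipped.
        if msg.metadata.get(name, False):
            msg = step(msg)
    return msg
```

The nice property of this setup is that adding a new step only means registering another single-purpose handler; existing messages are unaffected unless their metadata opts in.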

Manage API lifecycle sunrise to sunset with Azure API Management

In the second session of the day, Matthew Farmer and Anton Babadjanov showed us how we can use API Management to set up an API using a design-first approach. Continuing the Contoso Fitness scenario, they set up a situation where you need to onboard a partner to an API which has not been built yet. By using API Management we can set up a façade for the API, adding its methods and mock responses, allowing consumers to start working with the API quickly.
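The mocking part of that façade boils down to serving canned responses per operation until the real backend exists. A tiny sketch of the idea (the operation names, status codes and payloads here are invented for illustration, not taken from the session):

```python
# Canned mock responses keyed by (HTTP method, URL template).
# Consumers can code against these long before the backend is built.
MOCKS = {
    ("GET", "/members/{id}"): (200, {"id": "123", "name": "Sample Member", "plan": "gold"}),
    ("POST", "/members"): (201, {"id": "124"}),
}

def handle(method: str, template: str):
    """Return (status, body) for a mocked operation, or 404 if undefined."""
    return MOCKS.get((method, template), (404, {"error": "operation not defined"}))
```

In API Management itself this is done declaratively with a mock-response policy rather than code, but the contract seen by the consumer is the same.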

Another important subject is how you handle new versions of your API. Thanks to API Management you can now have versions and revisions of your API. Versions allow you to have different implementations of your API living next to each other, publicly available to your consumers, while revisions allow you to have a private new version of your API in which you can develop and test changes. Once you are happy with the changes made in a revision, you can publish it with the click of a button, making the new revision the public API. This is very powerful, as it allows us to safely test our changes, and easily roll back in case of any issues.
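A small model may help separate the two concepts: versions live side by side and are all public, while a revision stays private until it is made current. This sketch is purely illustrative of the behaviour, not of the API Management implementation:

```python
class Api:
    def __init__(self):
        self.versions = {}  # version -> {revision name -> handler}
        self.current = {}   # version -> the revision consumers see

    def add_revision(self, version, revision, handler, publish=False):
        """Register a revision; it stays private until published."""
        self.versions.setdefault(version, {})[revision] = handler
        if publish or version not in self.current:
            self.current[version] = revision

    def call(self, version, revision=None):
        # Consumers hit the current revision; a specific revision
        # can still be addressed explicitly for private testing.
        rev = revision or self.current[version]
        return self.versions[version][rev]()
```

Publishing is then just flipping which revision is current, which is also what makes rollback a one-step operation.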

Thanks to API Management we have the complete lifecycle of our APIs covered, going from the initial design, through the ALM story, all the way up to updating and deprovisioning.

Azure Logic Apps – Advanced integration patterns

Next up were Jeff Hollan and Derek Li, taking us behind the scenes of Logic Apps. Because of the massive scale these need to run at, there were many new challenges to be solved. Logic Apps does this by reading in the workflow definition and breaking it down into a composition of tasks and dependencies. These tasks are then distributed across various workers, each executing their own piece of the work. This allows for a high degree of parallelism, which is why Logic Apps can scale out indefinitely. It's important to take this knowledge into account in our own scenarios: tasks might not be processed in order, and may arrive at high scale, so our receiving systems need to be able to handle this. Also, as Logic Apps provides at-least-once delivery, we should look into idempotency for our systems.
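At-least-once delivery means the same message can arrive more than once, and possibly out of order. A minimal sketch of an idempotent receiver that deduplicates on a message id before applying any side effect (the message shape and the in-memory stores are assumptions for illustration; in practice the dedupe store would be durable):

```python
processed_ids = set()  # stand-in for a durable dedupe store
ledger = []            # stand-in for the real side effect

def receive(message: dict) -> bool:
    """Apply the message exactly once; return False for a duplicate."""
    msg_id = message["id"]
    if msg_id in processed_ids:
        return False  # duplicate delivery, safely ignored
    processed_ids.add(msg_id)
    ledger.append(message["amount"])
    return True
```

With this in place, a redelivered message is a no-op, so at-least-once delivery from Logic Apps becomes effectively exactly-once from the receiving system's point of view.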

Derek Li then showed us different kinds of patterns which can be used with Logic Apps, including parallel processing, exception handling, looping, timeouts, and the ability to control concurrency, which will be coming to the portal in the coming week. Using these patterns, Derek created a Logic App which sent out an approval email, and by adjusting the timeout and setting up exception handling on it, escalated to the approver's manager in case the approval was not processed within the timeout. These kinds of scenarios show us how powerful Logic Apps has become, truly allowing for a customized flow.
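The shape of that timeout-plus-escalation pattern can be sketched in a few lines. This is not how Logic Apps implements it (there it is declarative, with a timeout on the approval action and a run-after handler), just an illustration of the control flow; `check_approved` stands in for the approval email response:

```python
import time

def run_approval(check_approved, timeout_seconds, now=None, sleep=None):
    """Poll for approval until the deadline; escalate if it never arrives."""
    now = now or time.monotonic
    sleep = sleep or time.sleep
    deadline = now() + timeout_seconds
    while now() < deadline:
        if check_approved():
            return "approved"
        sleep(0.01)  # wait briefly between checks
    return "escalated-to-manager"
```

The injectable `now` and `sleep` parameters are only there so the timeout logic can be exercised quickly in tests.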

Bringing Logic Apps into DevOps with Visual Studio and monitoring

After some mingling with like-minded people during the break at Integrate 2017 USA, it was time for another session by Kevin Lam and Jeff Hollan, which is always a pleasure to see. In this session we dove into the story around DevOps and ALM for Logic Apps. These days Logic Apps is a first-class citizen within Visual Studio, allowing us to create and modify them, and to pull in and control existing Logic Apps from Azure.

As the Logic Apps designer creates ARM templates, we can also add these to source control like VSTS. By using the CI/CD possibilities of VSTS, we can then automatically deploy our Logic Apps, allowing for a completely automated deployment process.
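For reference, a Logic App ARM template has roughly the following shape. This is a minimal sketch, not a template exported from the designer: the parameter name and API version are illustrative, and the workflow definition is left empty where a real template carries the full triggers and actions.

```json
{
  "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentTemplate.json#",
  "contentVersion": "1.0.0.0",
  "parameters": {
    "logicAppName": { "type": "string" }
  },
  "resources": [
    {
      "type": "Microsoft.Logic/workflows",
      "apiVersion": "2017-07-01",
      "name": "[parameters('logicAppName')]",
      "location": "[resourceGroup().location]",
      "properties": {
        "definition": {
          "$schema": "https://schema.management.azure.com/providers/Microsoft.Logic/schemas/2016-06-01/workflowdefinition.json#",
          "triggers": {},
          "actions": {},
          "outputs": {}
        }
      }
    }
  ]
}
```

Because it is just an ARM template, the same file that the designer round-trips in Visual Studio is what a VSTS release pipeline deploys, parameterized per environment.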

Integrating the last mile with Microsoft Flow

For pro-integrators Microsoft Flow might not be the first tool that comes to mind, but it is actually a very interesting service. It allows us to create lightweight integrations, giving room to the idea of the democratization of integration. There is a plethora of templates available for Flow, allowing users to easily automate tasks, accessing both online and on-premises data. Custom connectors give us the option to expose any system we want. And because templates are categorized in verticals, users can quickly find the ones which are useful to them.

Flow also has the option to use buttons, which can be either physical buttons from Flic or bttn, or programmatic buttons in the Flow app. This allows on-demand flows to be executed at the click of a button, and shared within your company. For those who want more control over the flows that can be built, and the data that can be accessed, there is the Admin center, which is available with Flow Plan 2.

Looking at the updates which have happened over the last few months, it's clear the team has been working hard, making Flow ready for your enterprise.

And even more great things are about to come, so make sure to keep an eye on this.

Deep dive into BizTalk technologies and tools

Yesterday (on Day 1 of Integrate 2017 USA), we heard the announcement of Feature Pack 2 for BizTalk. For me, one of the coolest features coming is the ability to publish BizTalk endpoints through Azure API Management. This will allow us to easily expose an endpoint, either via SOAP pass-through or even with SOAP to REST, and take advantage of all the possibilities API Management brings us, like monitoring, throttling, authentication, etc. And all this with just a right-click on the port in BizTalk, pretty amazing.

As we saw in the first session of today, Microsoft has a huge integration landscape. With that many applications and artifacts, migrating to a new BizTalk version can become quite a challenge. To overcome this, Microsoft IT created the BizTalk Server Migration Tool, and published the tool for us as well. The tool takes care of migrating your applications to a new BizTalk environment, handling dependencies, services, certificates and everything else.

Looking at the numbers, we can see how much effort this saves, and how it minimizes the risk of errors. The tool supports migration to BizTalk 2016 from any version since BizTalk 2010, and is certainly a great asset for anyone looking into migration. So if you are running an older version of BizTalk, remember to migrate in time, to avoid falling outside the support timelines we saw yesterday.

What’s there & what’s coming in BizTalk360

Next up, Saravana Kumar, CEO of BizTalk360 and founding father of Integrate, guided us through his top 10 features of BizTalk360. Having worked with the product since its first release, I can only say it has gone through an amazing journey, and has become the de facto monitoring solution for BizTalk and its surrounding systems. It helps solve the challenges familiar to anyone who has administered BizTalk, giving insights into your environment, adding monitoring and notifications, and providing fine-grained security control.

Saravana's top 10 of BizTalk360 is as follows, and I pretty much agree with all of them.

1. Rich operational dashboards, showing you the health of your environment in one single place

2. Fine-grained security and auditing, so you can give your users access to only those things they need, without opening up your complete system

3. Graphical message flow, providing an end-to-end view of your message flows

4. Azure + BizTalk Server (single management tool), because Azure is becoming very important in most integrations these days

5. Monitoring – complete coverage, allowing us to monitor and, even more importantly, be notified of any issue in your environment

6. Data (no events) monitoring, giving us monitoring of what's not happening as well, for example expected messages not coming in

7. Auto healing – from failures, to make sure your environment keeps running, automatically coming back up after issues, either from mechanical or human causes

8. Scheduled reporting, which will be coming in the next version, creating reports about your environment on a regular basis

9. Analytics & messaging patterns, giving even more insight into what is happening, using graphical charts and widgets

10. Throttling analyser, because anyone who has ever needed to solve a throttling problem knows how difficult this can be, having to keep track of various performance counters; this feature provides a nice graphical overview and historical insights

11. Team knowledgebase, one more bonus feature that really should be mentioned: the knowledgebase is used to link articles to specific errors and faults, making sure this knowledge is readily available in your company

Of course, this is not all; BizTalk360 has a lot more great features, and I recommend everyone go and check it out.

Give your Bots connectivity, with Azure Logic Apps

Kent Weare, former MVP and now a Principal Program Manager at Microsoft on the Flow team, took us on a journey into bots, and into giving them connectivity with Logic Apps and Flow. First, setting the stage: we have all heard about digital transformation, but what is it all about? Digital transformation has become a bit of a buzzword, but the idea behind it is actually quite intriguing: using digital means to provide more value and new business models. The following quote shows this quite nicely.

“Digital transformation is the methodology in which organizations transform, create new business models and culture with digital technologies” – Ray Wang, Constellation Research

An important part here is that the culture in the organization will need to change as well, so go out and become a change agent within your organization.

Next we moved on to bots, which can be used to reduce barriers and empower users through conversational apps. With the rise of various messenger applications like WhatsApp and Facebook Messenger, there is a huge market to be reached here. There are many different kinds of bots, but they all have a common way of working, often incorporating cognitive services like the Language Understanding Intelligent Service (LUIS) to make the bot more human-friendly.

When we want to build our own bots, we have different possibilities here as well, depending on your background and skills. Kent had a great slide on this, making it clear you don’t have to be a pro integrator anymore to make compelling bots.

In his demos, Kent showed different ways to build a bot. The first used the Bot Framework with Logic Apps and Cognitive Services to make a complex, completely tailored bot. For the other two demos, he used Microsoft Flow in combination with Bizzy, a very cool connector which allows us to create a question-and-answer bot, analyzing the input from the user and making decisions on it. Finally, the ability to migrate Flow implementations to Logic Apps was demonstrated, allowing users to start a simple integration in Flow, with the ability to seamlessly migrate it to Logic Apps when more complexity is needed over the lifecycle of the integration.

Empowering the business using Logic Apps

Closing out this second day of Integrate 2017 USA, we had Steef-Jan Wiggers, with a view on Logic Apps from the business side. A very interesting session: instead of just going deep into the underlying technologies, he looked for the business value we can add using these technologies, which in the end is what it is all about. Serverless integration is a great way to provide value for your business, lowering costs and allowing for easy and massive scaling.

Steef-Jan went out to several companies who are actually using and implementing Azure, including Phidiax, MyTE, Mexia and ServiceBus360. The general consensus among them is that Logic Apps and the other Azure services are indeed adding value to their business, as they give them the ability to set up powerful new scenarios fast and easily.

With several great demos and customer cases, Steef-Jan showed clearly how he has already helped many customers add value to their business with these integrations. Integration platform as a service is here to stay, and according to Gartner, iPaaS will actually be the preferred option for new projects by 2019. He also went to the community leaders and experts to get their take on Logic Apps. The conclusion: these days Logic Apps is a mature and powerful tool in the iPaaS landscape.

So that was the end of the second day at Integrate 2017 USA, another day full of great sessions, inspiring demos, and amazing presenters. With one more day to go, Integrate 2017 USA is again one of the best events out there.

Check out the recap of Day 1 and Day 3 at Integrate 2017 USA.

Author: Eldert Grootenboer

Eldert is a Microsoft Integration Architect and Azure MVP from the Netherlands, currently working at Motion10, mainly focused on IoT, BizTalk Server and Azure integration. He comes from a .NET background, and has been in IT since 2006. He has been working with BizTalk since 2010 and has since expanded into Azure and surrounding technologies as well. Eldert loves working on integration projects, as each project brings new challenges and there is always something new to learn. In his spare time Eldert likes to be active in the integration community and get his hands dirty on new technologies. He can be found on Twitter at @egrootenboer and has a blog at http://blog.eldert.net/.

Integrate 2017 USA Day 1 Recap

And we’re off, the USA leg of the Integrate conference started today in Building 92 on the Microsoft campus in Redmond.

Saravana kicked off proceedings by setting the scene and giving us an indication of who we’re going to see over the next two and a half days.

It’s a good line up with speakers from the Microsoft integration teams and some great community speakers.

There was a shout out for Integration Monday and Middleware Friday, two awesome community efforts supported by Saravana and BizTalk360.

Saravana was followed by Duncan Barker from BizTalk360 who explained that BizTalk360 has now grown to 50 people and spoke about ServiceBus360 and how that has grown and continues to be developed.

Duncan also teased 2 new products that are coming in 2018, so that's definitely something to look out for, and mentioned that work is already underway for Integrate 2018, so watch your mailboxes for more information as the plans begin to take shape.

With the introductions and scene setting done, it was time for the leader of the awesome integration team to take the stage.

Jim Harrer – Limitless Possibilities with Azure Integration Services

Jim’s message was very much one of integration being the connective tissue that all solutions need to tie things together, reinforcing that there is ongoing investment in BizTalk Server and the story that Logic Apps and BizTalk Server are Better Together.

With over 180 connectors now in Logic Apps, including many that integrate directly with Azure services, it is possible to build integration solutions that span on-premises and cloud more effectively and to really accelerate adoption through hybrid integration. Taking an API-first approach is a great way to unlock business value.

Jim then moved on to serverless, a platform that is just there ready for you to use when you need it.

With serverless, you get improved build and delivery, reduced time to market and per action billing and it really flips traditional development on its head.

The Pro Integration team has had a busy year, and this was shown in a single slide.

This shows just how quickly things are changing and evolving, and has included things like Logic Apps going GA, feature packs being introduced for BizTalk, and API mocking, which has allowed teams to be more agile and progress at greater speed, making it possible to deliver integration solutions in weeks rather than months.

This agility has led to integration getting a seat at the table instead of being an afterthought.

We then had some great demos from Jon Fancey, Kevin Lam and Jeff Hollan who introduced the demo scenario that would be used throughout the conference, Contoso Fitness.

Jon kicked off the demos with a Logic App calling Spotify. This allowed him to show the new Custom Connector and a great resource, https://apis.guru/browse-apis/.

Kevin followed up looking at Azure Security Center and showed the tooling that was introduced at Microsoft Ignite recently. This provides integration directly between Azure Security Center and Logic Apps, including playbooks that are templates which integrate directly into typical service management tools such as Service Now.

Jeff did the last demo on Logic Apps and Cognitive Services. This showed the power of using the Video Indexer API and the ability to spin up a Docker container through a connector that will be released shortly. This container used FFMPEG, an open source tool, to take the transcript generated by the indexer and apply the information as subtitles in the video.

We finished with Jim urging everyone to maximise the value of their projects using integration.

Final message:

“Now is the time for integrators to unlock the impossible”

Paul Larsen – BizTalk – Connecting line-of-business applications across the Enterprise

Paul opened his presentation with a great image of a green screen from a mainframe that is running on campus.

This set the scene for a great presentation and dive into BizTalk and heritage systems. Paul insisted on calling them heritage rather than legacy, as heritage is something you celebrate and love whilst legacy has a number of negative connotations!

Paul again emphasized the importance of hybrid integration between BizTalk and the cloud, and the message really started resonating. He spent some time positioning BizTalk and how it had changed along with Host Integration Server over the years he has been on the team.

For me, his demo involving Contoso Fitness showcasing mobile applications, Logic Apps, virtual machines, HL7 and a mainframe was one of the best of the day. It showcased hybrid integration with the Logic Apps adapter, and the real breadth and depth of the Microsoft integration story.

Paul explained the reasoning behind the Feature Pack releases and how the team is able to deliver new value at a quicker cadence by introducing non-breaking changes, and he reviewed what had been delivered in Feature Pack 1.

The information was split between Deployment – application lifecycle management; Runtime – advanced scheduling, SQL encryption columns and web admin; and Analytics – AppInsights for tracking and the Power BI template.

He then mentioned that Feature Pack 2 would be released next month!

Splitting the information the same way, we had: Deployment – application lifecycle management for multiple servers and backup to Blob Storage; Runtime – an adapter for Service Bus v2, TLS 1.2 (although this may arrive in the next Cumulative Update, as it is a critical update), using API Management to expose orchestration endpoints, and sending/receiving from Event Hub; Analytics – sending data to Event Hub for tracking.

He walked through the BizTalk Server Support Lifecycle.

This shows that BizTalk Server 2013/2013 R2 goes out of mainstream support in 9 months and that people should at least start thinking about migrating. NOTE: A cool tool to help with this migration was presented by Microsoft IT on Day 2 and is available for use.

The most important slide was the BizTalk Roadmap.

This clearly shows an ongoing commitment to the product with a timeline for CUs, Feature Packs, and BizTalk vNext.

With that, Paul wrapped up, and we had a break followed by Jeff and Kevin.

Jeff Hollan/Kevin Lam – Azure Logic Apps – build cloud-scale integrations faster

You always know you’re in for a great session when these two stand up, and this session did not disappoint.

It was aimed as a level-setting session to get people up to speed on Logic Apps: what they are and why you'd use them.

To help emphasize the growth of the service, Kevin mentioned that at GA in June 2016 there were about two dozen connectors, now there are nearly 200!

Connectors provide a canonical form for integration that scales to meet the needs of the customer.

A slide was shown that had an animation of the current connectors that went on for a few pages and included colours to indicate connectors to Azure Services (blue) and those to other Microsoft services (orange), along with a list of others really showing how much coverage Logic Apps has.

One of the new features was shown – custom connectors.

Custom connectors are available now and are treated just the same as any other connector, including storing secrets in the Logic Apps secret store.

These conferences are great on their own, but when the teams share what’s next and any roadmap information it is particularly interesting. With that, we were teased with what connectors and services are coming soon.

These include the ability to initialize and destroy containers within the Azure Container Service, Oracle EBS and high availability for the on-premises data gateway. I am particularly interested in the container story and can see this as a great way of running transient compute workloads easily and only when required.

We then moved on to more level setting and to how agile the Logic Apps team is, highlighted by a slide showing what they have shipped this year, including Visual Studio tooling, nested foreach loops, and Ludicrous Mode, which allows sharding across the infrastructure to improve performance. Currently, the cadence is roughly a release every two weeks!

To highlight this agility even more, they showed what was coming soon to the service.

Particularly interesting are mock testing, to allow you to stub out connectors that are still being built; being able to resubmit from a failed action rather than an entire run; concurrency control, to govern how parallel foreach loops run, which can be important in ordered delivery scenarios; and snippets, which allow you to create some reusability across your Logic Apps.

The new pricing model that comes into effect on 1st November was shown. It brings a 32x reduction in the cost of native actions, a 6.5x reduction in the cost of standard connectors, and brings enterprise connectors in line with other connectors based on pay-per-execution.

The pricing changes also apply to Integration Accounts, which come down to a third of their previous price.

With that Jeff wrapped up with another great demo for Contoso Fitness showing how to integrate a Flic button to emulate a customer pushing a button on a fitness machine when it needed maintenance or cleaning, sending an alert via an HTTP trigger to ServiceNow.

We then had a change in presentation order, with Vlad and Miao covering API Management.

Vladimir Vinogradsky/Miao Jiang – Bolster your digital transformation with Azure API Management

Vlad provided a great overview of API Management and showed how APIs in general are really the common component of any solution, whether that is a Software as a Service product or the Internet of Things.

He continued by explaining how API Management is positioned and how it can be used to drive loyalty, build new services and channels to market and how it can help cope with multi-speed IT where not every part of a solution or business wants the same pace of change.

Vlad continued with a general overview of policies and how to use them to enforce certain things like access control and rate limits, and how you can chain them together by explaining the scope and the cumulative nature of policies.

After a discussion about security, the conversation moved on to the inclusion of VNets to help control access to on-premises APIs, and then to the multi-region support and scaling that is available as a premium feature. This allows you to deploy units of scale across regions, includes request caching out of the box, allows incremental growth of APIs, and allows different scales in different regions. It is a great way to grow your APIs as your business grows.

Miao then did a great demo, showing the key features of the service, firstly showing how to create an API, including SOAP to REST to allow more modern access to heritage APIs.

Using the Developer Portal to allow testing of the APIs he showed how to apply a number of policies such as removing headers, replacing backend URLs and rate limiting, followed by using the tracing feature to gain insight into the information passed to and from an API call and what policies are applied.

Any enterprise solution requires in-depth insight, so Miao moved on to monitoring and using Metrics in the Azure Portal to set alerts and using it to call a Logic App followed by the Diagnostic settings and Log Searching.

We then moved to looking at the new Power BI template that can be deployed with a single click.

This looks like a great way of delivering insight into an API Management deployment and has been created based on customer asks. It uses Event Hubs, Stream Analytics, and SQL Database.

After a slide that showed the growth in API Management, Vlad then showed how much work has been done in the last 12 months.

Like the other presentations, this shows just how agile and engaged the team is and how they are really delivering value to us as users of their service.

With that Vlad provided a list of resources and closed out the morning session.

After lunch, we had 3 presentations on the messaging services within Azure that took proceedings up to the afternoon break.

Dan Rosanova – Messaging yesterday, today and tomorrow

After lunch, Dan kicked off sessions about the messaging services in Azure starting with his own presentation about tools and how Microsoft is really a tools company.

Using a hammer for illustration, Dan gave a great presentation on where a hammer is the right tool and where it is not. This included an unusual demo that showed how to open a beer with a hammer live on stage!

And really that was the main thrust of the presentation, that with Azure messaging being such a large set of tools, it is important to choose the right tool for the job.

To further hammer home the point, he talked about four scenarios to fit these tools:

  • Task Queue using a Storage Queue to coordinate simple tasks across compute
  • Big data streaming using Event Hub to flow and process data and telemetry in real-time
  • Enterprise Messaging using Service Bus to manage business process state transitions
  • Eventing using Event Grid to provide a reactive programming model

Dan summed up by saying that Event Grid will be GAed soon and indicated that some new services outside Azure are coming.

Shubha Vijayasarathy – Azure Event Hubs: the world’s most widely used telemetry service

Shubha set the scene using a big data scenario and how Event Hub can be used to provide a single service solution to common problems around telemetry and data pipelines.

She moved on to how Event Hub answers the typical questions asked about big data solutions: how do you handle data that has velocity, volume, and variety; can you deal with regional disasters; can you do real-time streaming as well as batch capture; what can Event Hub integrate with; and how can you handle support? Again, for any production solution, it is important to be able to lift the covers and see what is happening and how a solution is performing.

Shubha did a great demo showing how to use Event Hubs and Event Grid to move stream data into SQL Data Warehouse using the Capture feature of Event Hub, which persists the telemetry data into a storage account. The demo used an Azure Function, reacting to the Event Grid event fired when the capture file was created in storage, to process the data into SQL DW.
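The glue in that demo is a function that receives the Event Grid notification and picks out the blob written by Capture. A minimal sketch of that handling, using the Event Grid schema for Blob Storage events (the payload here is illustrative; the loading into SQL DW itself is left out):

```python
import json

def handle_event_grid(payload: str):
    """Return blob URLs from BlobCreated events; ignore everything else."""
    events = json.loads(payload)  # Event Grid delivers an array of events
    urls = []
    for event in events:
        if event.get("eventType") == "Microsoft.Storage.BlobCreated":
            urls.append(event["data"]["url"])  # the captured file in storage
    return urls
```

Filtering on the event type matters because a subscription can deliver more than one kind of storage event to the same endpoint.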

Leaving the best until last, Shubha gave some indication of what is coming soon from the team.

This includes the general availability of Geo DR, IP filtering and VNet support and a portal experience for creating dedicated clusters.

We then had a bonus session for the day that was not scheduled.

Christian Wolf – Azure Service Bus: Who doesn’t know it?

So Dan covered the messaging services available, Shubha covered Event Hubs, and Christian went on to cover one of the oldest services in Azure.

This was a shorter but highly focussed session that started with what is new and soon to be released in Service Bus.

He went through the important points of the slide, including that the Event Grid scenario is for lower volumes and not millions of messages. They are introducing Geo DR for Service Bus, which will allow you to pair 2 independent namespaces and access them through an alias. NOTE: In this first release only metadata is failed over between regions, not the data on any Service Bus asset.

A good point was made about the .NET Standard client: it has breaking changes, so Christian urged anyone wanting to adopt it to spend time on the release notes and testing.

Christian then did a couple of good demos, the first using Service Bus and Event Grid to simulate Clemens Vasters wanting to buy an airplane (so a likely scenario!), and using Dynamics 365 to react to a new sales opportunity. The second demo showed the Geo DR capabilities, and showed that monitoring is not entirely straightforward; Christian used ServiceBus360 to help drive the demo.

Christian finished with what’s next for Service Bus.

This includes a capability to allow migration between the standard and premium SKUs, a new management library, the introduction of throttling in the standard SKU (which is not dedicated) to eliminate noisy neighbours, and Data DR as a broader part of the disaster recovery strategy.

This led to the final break of the day, with 2 more presentations standing between attendees and Scott Guthrie’s keynote.

Eduardo Laureano – Azure Functions – Serverless compute in the cloud

We started with an overview of Functions and the components of the service.

edfunction

Eduardo explained how Functions evolved: it came from App Service, so HTTP has always been a native part of the service.

Eduardo showed the bindings and triggers, but directed people to the documentation for an up-to-date list.

Following a discussion about developer tooling, the conversation turned to Functions by the numbers. The key takeaway was that when customers move to Functions, they continue to move more things over as they evolve their ecosystems.

Eduardo did a demo that really showed the power of bindings, walking through the Function creation process for a Blob Storage trigger, performing a simple file upload, changing the input from Stream to byte[], and showing that it still works exactly the same way.

After speaking about the difference between Function bindings and Logic Apps connectors (low code v no code, 23 bindings v 180+ connectors, ideal for data flow v ideal for workflow orchestration, data type in code v fully managed) Eduardo explained that as Functions is open source, anyone can go and create a new custom binding, and that he’d be happy to discuss having more community contributed bindings in the service.

We then moved on to the new Microsoft Graph binding announced at Ignite.

edgraph

This provides a way of finding correlations across different data sets, but the real magic is that it incorporates identity so you don’t have to.

We had 2 demos, the first showing the Graph binding, and the second showing the new Excel binding with data being added to an Excel file.

Proxies is a recently added feature that will be going GA soon, so Eduardo spent some time explaining how it works, and did a great demo showing how you can use proxies for URL redirection and mocking of responses, since you can specify a response payload. He then gave some scenarios where you may want to use proxies.
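For flavour, a mock like the one Eduardo demoed can be declared in a Function app's proxies.json; this sketch (the route and payload are invented here) returns a canned response with no backend behind it:

```json
{
  "$schema": "http://json.schemastore.org/proxies",
  "proxies": {
    "mockOrderStatus": {
      "matchCondition": {
        "methods": [ "GET" ],
        "route": "/api/orders/{id}/status"
      },
      "responseOverrides": {
        "response.statusCode": "200",
        "response.headers.Content-Type": "application/json",
        "response.body": "{ \"id\": \"{id}\", \"status\": \"shipped\" }"
      }
    }
  }
}
```

Because there is no `backendUri`, the proxy answers directly with the override, which is exactly what makes proxies useful for mocking an API that does not exist yet.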

edproxies

Like most of the presentations during the day, he finished with a list of takeaways and resources.

The final presentation of the day before Scott was delivered by Jon Fancey.

Jon Fancey – Enterprise Integration with Logic Apps

Jon started by level setting and explaining that the Integration Account in Azure is the basic unit of work for Enterprise Integration.

jonintacct

He explained about the XML and B2B capabilities that are provided with the Integration Account and talked about DR scenarios which are important to consider as Integration Accounts hold stateful information. DR is achieved by having a Primary and (multiple) Secondary Integration Accounts in different regions, and the service uses Logic Apps to keep Integration Account states in sync.

Jon moved on to trading partner migration and a tool (TPM) that has been written to allow customers to easily move trading partners and agreements between BizTalk Server and Logic Apps.

jontpm

Jon gave an explanation of the traditional VETER pipeline and then moved to what is new in mapping.

jonmapping

With this, he introduced Liquid, which allows mapping between different entity types using a DSL, and did a demo of it using Visual Studio Code.
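To give a flavour of the DSL (a made-up template, not Jon's demo), a Liquid map in Logic Apps receives the input JSON under `content` and might reshape it like this:

```liquid
{
  "fullName": "{{content.firstName}} {{content.lastName}}",
  "orderIds": [
    {% for order in content.orders %}
      "{{order.id}}"{% unless forloop.last %},{% endunless %}
    {% endfor %}
  ]
}
```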

After talking about the tracking features in Logic Apps, Jon gave us a glimpse of what was coming in Monitoring.

jonmonitoring

Key takeaways from this list are the OMS template and the work on harmonising the querying capabilities to bring them in line with App Insights.

Jon did a demo to highlight these features: showing OMS in the portal, drilling through the data, demonstrating batch resubmit by looking at runs and selecting them, showing tracked properties containing your own tracking data, and then taking a query and creating a custom tile in the OMS workspace.

Next up for the “new” treatment was connectors.

jonconnectors

There had already been discussion about custom connectors earlier in the day, but it was great to see SOAP to REST, which shipped the same day, allowing even more opportunities to leverage current investments.

Time for another demo, this time looking at SOAP to REST using a custom connector. This was a great demo that involved Jon changing a SOAP app on the fly, adding a new custom connector, then running the service, complete with a great He-Man reference, "By the Power of Grayskull" – always a bonus!

Jon talked about the new batching feature and then gave us a view of what was new and coming.

jonbatch

The last demo of the day showed off the batching feature before Jon did a quick recap and showed some resources.

That was the end of the first day prior to the keynote, and what a great day it was. There was plenty of information for people who had some knowledge but wanted to learn more, and the presenters were very good at gauging the level of the audience.

With great demos, great presentations and great presenters, the conference got off to a flying start.

The only thing that was left after this was the man in the red polo shirt, but let’s cover that in its own post!

Integrate 2017 USA Keynote

After a great start and some great content on Day 1 at Integrate 2017 USA, it was time for the keynote. Jim Harrer returned to introduce the man known as ScottGu – the man in the red shirt.

Fresh off the back of his Red Shirt Tour, Scott Guthrie took to the stage to deliver a presentation to wrap up Day 1 of Integrate 2017 USA.

Integrate 2017 USA

He started by asking the question, what are the big opportunities for integration?

He echoed Jim’s sentiment earlier in the day that integration is the glue and an essential part of any enterprise solution.

As Integrate 2017 USA is a technical conference, he wanted to show some buzzwords and terms.

Integrate 2017 USA

Integration has a part to play everywhere. He said now is the time – time to build new things, new solutions.

Furthermore, he went as far as to say that integration is now transformational, creating new revenue streams and services, reinventing the way we do business, but security is critical.

We need to be using Azure to do things differently; in a productive way, a hybrid way, an intelligent way and a trusted way.

Integrate 2017 USA

He spoke about the reach of Azure, with an unparalleled capability to reach a global audience through 42 Azure regions, providing global reach for global business – and a great fact: 20% of all the power in Ireland is used by the North Europe data centre!

He showed a great video about what a data centre looks like, which gave a glimpse into just how impressive the Azure cloud is.

But Azure is also a Trusted Cloud.

Integrate 2017 USA

Azure has more certifications than any rival cloud provider, and provides a guarantee that regional data stays in the region and fails over across paired regions. Germany and China have specific data requirements so their data has even more protections.

Scott then shared that 90% of Fortune 500 companies use Azure.

Time for another video, this one on customers using Azure including Asos, Dominos, Rockwell Automation and Geico.

Scott calls out integration again as the enabler of all these scenarios, one of the most critical components of the overall stack that delivers the value.

Next up is a great summary slide that shows the technology at play in Microsoft Azure across tools, advanced workloads and core infrastructure.

Integrate 2017 USA

Integration is an important part of this, as is hybrid cloud.

Time for Scott to do some demos and showcase some integration scenarios.

The first demo is a backend solution driven by AI and data to create workflows. In it we see a Twitter analysis Logic App that monitors social networks and uses Cognitive Services to detect sentiment and perform key phrase extraction. The phrases are then analysed in a Function, and rows are added to a Power BI dataset.

The final part of the puzzle is sending a message to Microsoft Teams if the sentiment score is below 0.3, and creating a case in Dynamics 365 for support follow-up.
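In Logic Apps terms, that check is just a condition action over the Cognitive Services output; a sketch of the relevant fragment (the action name is hypothetical):

```json
{
  "type": "If",
  "expression": "@less(body('Detect_Sentiment')?['score'], 0.3)"
}
```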

This solution is live and in use right now; Azure Support is currently able to reach out within three and a half minutes of negative sentiment being detected.

One curiosity was when showing Microsoft Teams, it clearly showed that not everything that seems negative is negative!

The second demo was based on a customer visit to a fitness organisation; it was built within the customer meeting and demonstrated Azure's ability to solve real-world problems and address pain points.

The demo used a PowerApp to take a picture, then used the Face API in Cognitive Services to handle gym sign-up and check-in.

Now back to the slides: integration combined with other services opens up many more possibilities, and is better than a pure integration play.

Another quick demo showed how to create a new database, then covered the capabilities of SQL Database, including point-in-time recovery and recommendations for optimising and auto-tuning.

This was then extended to talk about SQL injection, and how the SQL Database service has threat analytics built in, which can automatically block or take other actions as required when it perceives a threat.

Scott moved on to Virtual Machines and showed how to manage them, including the capability to manage multiple machines at once: you can look at Update Management (patch management) and check the patching compliance of VMs, including Windows, Linux and non-Azure computers.

VM Inventory gives visibility of what VMs exist and their capabilities, while Change Tracking shows what has changed – files, registry settings and software – and also supports managing multiple VMs at once, making operators more efficient.

All of these demos allowed Scott to show that Azure has a rich set of features with huge breadth.

Back to the core message to finish, integration is at the centre for connecting things together, and we can do it productively to deliver quickly, we can do it in a hybrid way to join cloud and on-premises, we can do it intelligently using AI and Cognitive Services and we can do it in a trusted way on a cloud that is compliant and secure.

With that Scott wrapped up and the first day concluded at Integrate 2017 USA, a day full of knowledge, information and humour.

Our experience at Design Thinking Summit 2017

Design Thinking – the name sounds different. Can we design our thinking? Yes, we can, and this is what we learnt from the Design Thinking Summit 2017, held at IIM Bangalore. We are grateful to our organization for providing us with such a wonderful opportunity to participate in this event.

BizTalk360 always focuses on the motto "You grow, we grow, together we grow", and helps its employees acquire skills through different learning and training programs. One such opportunity was given to 6 of us to attend the Design Thinking Summit 2017, and I am lucky to be one among them. In this blog, I would like to share my experiences at the DTSummit. Special thanks to Saranya and Umamaheshwaran for adding more meaning to this blog by sharing their experiences.

Design Thinking Summit 2017

An intro to Design Thinking Summit – Insight:

Design Thinking is a creative, repeatable, human-centered approach to problem-solving and innovation. It draws upon experience, imagination, intuition, and reasoning to explore possibilities of what could be – to create desired outcomes that benefit your customers. This summit was organized by a group called Pensaar, powered by a team of highly experienced design thinkers and problem solvers. Over the 3-day workshop conducted by the Pensaar team, we learnt how to understand customers, articulate insights that inspire innovation, and ideate until you get disruptive ideas that can rapidly be tested with customers. It is focused on learning by doing – all while experimenting, experiencing, having fun and being surprised. There were around 160 participants this year at the DTSummit.

Day 1 at the Design Thinking Summit:

The event was completely new to our BizTalk360 team. We were asked to assemble at the venue at 8.30 AM. To our surprise, the participants were split into different teams and each one of us was placed in a different team. This was a nice experience, as we got to know different people from different professions. We were given cards with our photo attached and our table number written on them. Everything was a team activity, with a coordinator for each team.

Design Thinking involves four stages, namely:

  • Discover – Understand people and their ideas
  • Insight – Identify trends and inspire innovation
  • Dream – Ideate solutions for defined problem statements
  • Disrupt – Prototyping techniques that visualize solutions

Design Thinking Summit 2017

The first day was about "Insight". The first step towards insight is "Discover" – the foremost task is to understand people and their ideas. "Insight" stands for identifying trends and patterns in data which will inspire innovation. The quotes below explain it.

Fall in love with the problem and not with the solution

Products must be created for behaviors and not for intentions

The first day started with an activity to come up with an innovative name for each team. Stationery was provided along with post-its. An interesting thing to note about the venue (IIM Bangalore) is that plastics are banned, and we were given glass water bottles with our names printed on them. There were around 12 teams and each team came up with a unique name.

The interesting interview:

The next event was an interesting interview with a reputed industrialist. The aim was to capture the insights of the person and utilize them for a better understanding of the requirements. We were asked to listen to the interview and note down our points on the post-its. An important feature of a post-it is that you cannot write long stories on it – the notes must be short and understandable, so we needed to choose the best words to describe our points and ideas. Some of the key insights derived from the interview were:

  • Be focussed on process
  • Build expertise and use it when the opportunity arises
  • Focus more on soft skills

For example, consider a scenario where we gather the requirements for a product from a customer. The skills to be observed in this process are:

  • Asking open ended questions
  • Listening skills
  • Observing skills

The Research Methodology:

Once the customer requirements are gathered, the next step is to dig deep into them for better understanding. One of the research methodologies was:

Ecosystem Map:

This is a visual representation of the landscape within which a problem exists. The map contains the connections between the different stakeholders involved in the problem, letting us visually depict the interconnections and inter-dependencies between the stakeholders in the system. This way we can draw key inferences and insights by asking questions like: what are the challenges in the system, what can be improved, and what interventions can be made to create a positive impact?

Arriving at the problem statement:

We now have the ecosystem map. The next activity is to identify the problem statement. We can consider any one of the stakeholders and derive the statement for them. The stakeholder may be a customer, an employee, the government or the senior management of the organization. Each individual team member was asked to write down his/her problem statement based on the following points, describing:

  • User characteristics
  • Outcome the user tries to achieve
  • Barrier existing to achieve the outcome
  • Root cause of the barrier
  • How the user feels because of the root cause and the barrier

This problem statement is important because it is from this point that we move forward in deriving a solution. From the individual points, the team coordinator would lead a discussion and come up with a single problem statement for the team. The problem statement is written from the user's point of view, and it helps to identify and articulate the right problem to solve for the users.

There are different tools which help us in deriving the problem statement, such as:

Empathy map – mapping the different data points for the user

Subway map – plotting the objectives with respect to the current state and prioritizing them.

Design Thinking Summit 2017

User persona – Using quote cards, we can derive the insights for different problems given in the cards.

Journey line – steps involved from arriving at the problem statement to improving on the solutions

These tools are considered the convergent research technique tools for understanding the problem better. At the end of the first day, the Pensaar team collected the feedback about the activities conducted.

Day 2 at the Design Thinking Summit:

The first day was interesting, and its outcome was the problem statement. Then came the second day of the event, and we were all the more excited for its activities. The second day started with an introduction of the Pensaar team, who were behind the scenes of this wonderful Summit.

The agenda for this day was "Dream". The first day had resulted in a problem statement drawn from the insights of the different groups; now we needed to work our way towards a solution. But that would not be so easy: one problem statement would be worked on by all 12 teams, so there would be many different solutions, and it was important to identify the best one.

Arriving at the Customer Benefits:

The first activity of the second day was to "Identify three key customer benefits". A customer benefit leads to an improvement in the customer's life; it is what matters most to the customer when choosing our product over others. The benefits can be measured through certain metrics, which help in identifying the right priorities to acquire more customers. This can be done by crafting a creative question starting with "How might we", which lets you reframe the problem as an opportunity and ideate solutions with a sense of optimism, seeing the possibilities.

Lunch break:

There was another surprise waiting for us during the lunch break: a picnic lunch for each team. The team members had to collect the lunch for their team mates and have it under the trees in a different area. This was very interesting and we all enjoyed it.

Tools for Ideating:

The next step was to ideate solutions for the problem statement based on the key customer benefits, which was the next activity given. There are various tools used for ideation, and a few of them were given to the teams for the activity. Among them were:

Question storming:

This is a method for discovering the questions that make breakthrough differences in problem-solving, innovation, operational excellence and culture. The questions must be focussed on the facts and the situation to get to the root of a problem.

Emerging Tech cards:

These are small cards containing information about the emerging technologies in different areas. The activity was to identify the relevant tech card and find out how to make use of it in identifying the solution to the problem.

Design Thinking Summit 2017

Biomimicry:

This is drawing inspiration from nature to design the solution. Simply put, it is mimicking nature to inspire sustainable and innovative solutions. Take the example of ants and their ability to self-organize to find the shortest route; this can be used to find the best solution.

World Café:

This was a post-lunch activity. The teams were asked to write down their problem statement and their ideas for different solutions. The aim was to build collaboration among the teams rather than working individually, so each team member visited other teams to gather knowledge about their ideas and provide inputs for improvements.

With this activity, we came to an end for the second day.

Day 3 at the Design Thinking Summit:

Day 3 was filled with even more enthusiasm among the teams, because we all had new friends and the past two days had given us a different experience. This day started with the activity for "Disrupt", which develops prototypes for the solutions derived and then experiments with them. It started with:

Story Board:

It’s a visual tool to build a narrative around the solution to get feedback and refine the concept. The teams were asked to build the story board with their problem statement and the solutions.

Design Thinking Summit 2017

Message Map:

This is an excellent tool to create an elevator pitch that communicates our concept to users in less than 15 seconds. The steps include creating a Twitter-friendly message about the solution and adding supporting points to explain it.

Design Thinking Summit 2017

Experimenting the solution:

The final activity of the event was experimenting with the solutions. Each team was asked to create an experiment card, which includes the hypothesis, the experiment, the metric and the outcome. With this card, we could test our solutions with different users and record the outcome. The teams moved around IIM to find users and filled in the cards according to the responses received. It was a totally different experience, where we also travelled outside to find users and get their feedback.

Conclusion:

It was a totally fantastic experience for all of us. Design thinking starts from identifying the exact problem statement (Insight), ideating through different solutions (Dream) and experimenting with those ideas (Disrupt), for the development of an employee as well as an organization. These tools can also be utilized in our day-to-day activities for the betterment of our lives as well as our careers. Thanks to BizTalk360 for giving us the chance to participate in this event; we are looking forward to more such events.

Author: Praveena Jayanarayanan

I am working as Senior Support Engineer at BizTalk360. I always believe in team work leading to success because "We all cannot do everything or solve every issue. 'It's impossible'. However, if we each simply do our part, make our own contribution, regardless of how small we may think it is…. together it adds up and great things get accomplished."

INTEGRATE 2017 – Recap of Day 3

After a scintillating Day 1 and Day 2 at INTEGRATE 2017, the stage was perfectly set for the last day (Day 3) of the event. Before you proceed further, we recommend you take a read of the following links –

Quick Links

Session 1 – Rethinking Integration by Nino Crudele

Day 3 at INTEGRATE 2017 started off with the “Brad Pitt of the Integration Community” – Nino Crudele. It was a perfect start to the last day of this premier integration focused conference.

Nino started off his session by thanking his mentor, a fellow MVP, for instilling knowledge about Power BI. This session was based on real experience. Nino shared how his job is his passion, describing three different types of jobs – Bizzy (BizTalk), DEFCON1, and Chicken Way. In this context, what Nino calls the Chicken Way is about how you actually solve the problem – you can take a direct or an indirect approach.

Nino even had some Chicken Way Red Cards to give away to the community and some reactions to that were –

Then Nino presented the most comical slide of the entire #Integrate2017 event – a question and answer from his 12-year-old daughter about BizTalk.

The above slide shows how people actually perceive the technology. Therefore, it's imperative to choose the proper technology to solve the specific problem and make the customer happy. Nino also explained what, according to him, are the top technology stacks, and made a point that "BizTalk is SOLID" – a very solid technology platform.

Then Nino gave an example from his customer experience, where the customer was using 15 BizTalk Servers! :O Nino suggested changes to certain approaches in their business process, and a way to get real-time performance improvements. The customer was also looking for really fast hybrid integration (point to point) with BizTalk, with real-time monitoring, tracing and so on. Nino suggested a framework built completely on the cloud. This approach was more reliable and scalable, and gave the customer complete control over the messaging system. The solution made use of Logic Apps, Event Hubs, Service Bus, Blob storage and many more such services, which made the customer happy.

The session moved into a cool demo from Nino (real time data visualization in Power BI using custom visualization) which you can get to watch when the videos go Live on the INTEGRATE 2017 website.

Session 2 – Moving to Cloud-Native Integration by Richard Seroter

The second session of the day was from Richard Seroter on Moving to Cloud-Native Integration. Richard started off his talk with the analogy of the "theory of constraints", where a process's throughput is limited by its constraint (bottleneck). In any software environment, you have to focus on what constraint is slowing you down and optimize it. In an organization, there is a chance that the "integration might itself be the constraint" that slows down the business.

Therefore, Richard introduces the concept of cloud native integration to connect different systems.

Integration Today

According to Gartner, application-to-application integration is currently the most critical integration scenario, while a few years down the line cloud service integration will rise to the top. Actual spending on integration platforms is on the rise, with the fastest growth in iPaaS and API Management.

Again, Gartner says that by 2020, 75% of companies will establish a hybrid integration platform using infrastructure they assemble from different vendors, and by 2021 at least 50% of large companies will have incorporated citizen integrator capabilities into their integration infrastructure.

What is Cloud Native?

Cloud native is basically about "how" you build software!

The following image clearly shows the difference between a traditional enterprise and a cloud native enterprise.

Delivering Cloud Native Integration

  • Build a more composable solution that is
    • Loosely coupled
    • Makes use of choreographed services
    • Push more logic to the endpoints
    • Offer targeted updates

Richard then jumped into his demos. In the first demo, he used a Logic App as a data pipeline: the Logic App receives a message from a queue, calls a service running in Azure App Service, calls an Azure Function that does some fraud processing, and feeds the result message back to a queue for further processing.

To feed the queue, Richard deployed another Logic App, where a file is picked up from OneDrive, parsed as a JSON array, and its elements dumped onto the queue that feeds the other Logic App.
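The debatching step can be sketched in a few lines of plain Python (standing in for the Logic App's parse and send actions; the file contents here are invented):

```python
import json

# Contents of the file picked up from OneDrive: a JSON array of orders.
file_contents = '[{"orderId": 1, "amount": 250.0}, {"orderId": 2, "amount": 75.5}]'

def debatch(payload: str):
    """Parse a JSON array and return one queue message (a JSON string) per
    element, mirroring the Logic App that feeds the fraud-processing pipeline."""
    return [json.dumps(item) for item in json.loads(payload)]

messages = debatch(file_contents)
print(len(messages))  # one message per order
```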

That’s not it! Richard had a few more demos in store – making BizTalk Server easy, where he used the BizTalk Server 2016 FP1 Management APIs to create BizTalk artifacts self-service style, and automating Azure via Service Broker.

We recommend you to watch this session when the video is made available in a week’s time on the INTEGRATE 2017 website.

Session 3 – Overcoming Challenges When Taking Your Logic App into Production

Stephen started off with a key announcement about a new Pluralsight course – “BizTalk Server Administration with BizTalk360“. The course will be made available shortly.

Phase 1 of the session was targeted towards ‘decision making‘, phase 2 covered what they did right and wrong, and the last phase offered some important tips.

Decisions

Stephen compared building a .NET parser solution to Logic Apps development. Logic Apps was calculated to finish earlier and be considerably cheaper. They even questioned whether an Integration Account is worth the price ($1,000 per month).

What’s Wrong and Right?

  • Make design decisions based on the rules of the serverless platform, factoring in the cost per Logic App action
  • Stephen described that initially they used 2 subscriptions in 2 regions, but this made deployment across regions hard; the best practice is to have one subscription in one region
  • Solution structure – a solution maps to a resource group, use one project per Logic App, and maintain 3 parameter files, one per environment. For performing deployments you can create a custom VM
  • Serverless is AMAZING, but sometimes things break through no fault of your own, and sometimes Microsoft support needs to be called in to fix issues

Tips

  • Read the available documentation
  • Don’t be afraid of JSON – code view is still needed, especially with new features, but most features soon become available in the designer and Visual Studio. Always save or check in before switching to JSON
  • Make sure to fully configure your actions, otherwise you cannot save the Logic App
  • Get the name of an action right first time; it is hard to change afterwards
  • Try to use only one Microsoft account
  • If you get odd deployment results, close and reopen your browser
  • Connections – these live at resource group level, and the last deployment wins. Best practice: define all connection parameters in one Logic App, with one connection per destination, per resource group
  • Default retries – all actions retry 4 additional times over 20-second intervals; control this using retry policies
  • Resource group artefacts – these contain subscription ids, so use parameters instead
  • For each loops – limited to 100,000 iterations; they default to multiple concurrent loops, but can be changed to sequential loops
  • Recurrence – singleton
  • User permissions (IAM) – multiple roles exist, such as the Logic App Contributor and the Logic App Operator
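To illustrate the retry tip above: the default of 4 additional attempts at 20-second intervals can be overridden per action with a retry policy in that action's inputs (the values here are arbitrary):

```json
"retryPolicy": {
  "type": "fixed",
  "count": 2,
  "interval": "PT30S"
}
```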

With that, it was time for the attendees to take a break!

After the break, Duncan Barker from the BizTalk360 team took the stage to thank the wonderful team at BizTalk360 for all their effort in making INTEGRATE 2017 a great success!

Session 4 – BizTalk Server Deep Dive into Feature Pack 1

Tord was given a warm welcome with the song “Rise” by Katy Perry. Tord played along, saying what good friends he and Katy Perry are and telling the story behind how she wrote the song for BizTalk. 🙂

Fun aside, Tord started off the session by telling how BizTalk Server 2016 almost got a pink theme for the icons! :O Just hours before the team was to do the final build for the BizTalk Server 2016 Feature Pack 1 release, one of the engineers pointed out the pink stroke on the outside of all the icons. The team managed to fix it and ship the release.

But did you know there is still one tiny pixel of pink somewhere in some icon? If you find it, send Tord an email and he will send you a nice gift!

BizTalk Connector in Logic Apps is now Generally Available with Full Support!!!

The Microsoft IT team has built a first-class project to help you migrate easily to BizTalk Server 2016. You can get a downloadable version of the application from the link below. If migration is what is holding you back, make use of this application.

With BizTalk Server you can do so many things – you can even take advantage of the cloud through BizTalk Server. Tord walked through the different features released as part of Feature Pack 1 in detail, with some live demos.

Session 5 – BizTalk Server Fast & Loud

After that power-packed introduction for Sandro Pereira from Daniel Szweda, comparing him with Cristiano Ronaldo (who also hails from Portugal), guess what happened! SANDRO PEREIRA forgot to turn on his machine to show his presentation :O The IT admin at Kings Place had to show up 5-6 times to get the “problem” solved, and Sandro put it down to the famous “jetlag” that was associated with most speakers during any technical issue 😛 🙂 And there was a roar when the presentation finally worked for Sandro! Phew… there goes the BizTalk Wiki Ninja, BizTalk Mapper Man, The Stencil Guy into his session.

Sandro started off his session with this slide

Sandro’s session was more towards BizTalk Server optimization and performance. The points discussed in this session were –

SQL Server

  • Clients still don’t have BizTalk Jobs running
  • Comparing BizTalk to a car:
    • BizTalk Server is the chassis
    • SQL Server is the engine
    • The hard drives are the tires
    • Memory is the battery
    • CPU is the fuel injector
    • The network and virtualization layer is the exhaust pipe
  • Make sure BizTalk Server and SQL Server Agent jobs are configured and running
  • Treat BizTalk databases as a Black box
  • Size really matters in BizTalk! Large databases impact performance (e.g., MessageBoxDB, the Tracking database)
  • Consider dedicating SQL resources to BizTalk Server
  • Consider splitting TempDB into multiple data files for better performance

Networking

  • Speed defines everything for this layer
  • At a minimum, you need to have 1 logical disk for data files, 1 for transaction log files, and 1 for TempDB data files
  • Remove unnecessary network hops
  • Scaling out is not a solution to all problems – sometimes you may also have to scale in to solve a problem!

Session 6 – BizTalk Health Check – What and How?

The last session before lunch was BizTalk Health Check – What and How? by Saffieldin Ali. A BizTalk health check is similar to the MOT test performed on vehicles in the UK – a compulsory annual check of a vehicle’s safety, roadworthiness and exhaust emissions.

In BizTalk, the health check is performed to –

  • Identify symptoms and potential problems before they affect the production environment
  • Review critical processes to achieve minimum downtime during disaster recovery
  • Identify any warnings and red flags that may be affecting users
  • Understand common mistakes made by administrators and developers
  • Understand supportability and best practices

BizTalk Health Check Process

Interviewing

  • Operations Interview (1-1 meetings with admins/dev teams to collect operational view of things)
  • Knowledge Transfer

Collecting

  • Run collection tools (BizTalk Health Monitor etc)
  • Collect informal information (for example, an admin mentioning “I did something wrong last week” during an informal chat)

Analysis and Reporting

  • Run and examine analysis tools results
  • Write and Present final conclusion

BizTalk Health Check Areas

  1. Platform configuration for BizTalk Server
  2. BizTalk Server Configuration
  3. BizTalk Performance
  4. Resilience (High Availability)
  5. SQL Server Configuration for BizTalk Server
  6. Disaster Recovery
  7. Security
  8. BizTalk Application Management and Monitoring

BizTalk Health Check Key Tools

  1. Microsoft Baseline Security Analyser (MBSA)
  2. BizTalk Best Practices Analyser
  3. BizTalk Health Monitor (BHM)
  4. Perf Analysis of Logs (PAL)

Saffieldin showed how each of the above tools works and how they perform checks on the BizTalk environment.

It was time for the community to break out for Lunch and some networking before the close of the event in the next couple of hours.

Session 7 – The Hitchhiker’s Guide to Hybrid Connectivity by Dan Toomey

The last leg of #Integrate2017 was quite significant. All three speakers – Dan Toomey, Wagner Silveira and Martin Abbott – had flown into London after some long flights: Dan and Martin from Australia (about 20 hours) and Wagner from New Zealand (about 30 hours!).

Post lunch, it was time for Dan Toomey from Australia to take the stage to talk about The Hitchhiker’s Guide to Hybrid Connectivity.

Dan started his talk about the types of Azure Virtual Network –

  • Point to Site (P2S) – similar to working from home and connecting to the corporate network (e.g., via Citrix/VPN) over the internet
  • Site to Site (S2S) – taking an entire network and joining it with another network over the internet
  • ExpressRoute – like taking a giant cable (managed by someone else) and connecting your corporate network over it

VNET Integration for Web/Mobile Apps

  • Requires Standard and Premium App Service Plan
  • VNET must be in the same subscription as App Service Plan
  • Must have Point to Site enabled
  • Must have Dynamic Routing Gateway

VNET with API Management

If you have API Management that is sitting in your Virtual Network with access to your Corporate Network gateway, you will get:

  • Added layer of security
  • All benefits of API Management (caching, policies, protocol translation [SOAP to REST], Analytics, etc)

Non-Network based Operations

Azure Relay (an alternate approach) – This is a new offering with Azure Service Bus

    • WCF Relay
    • Hybrid Connections
      • Operates at transport level

On-Premises Data Gateway

  • Generally available since 4th May 2017
  • Acts as a bridge between Azure PaaS and on-prem resources
  • Works with connectors for Azure Logic Apps, Power Apps, Flow and Power BI

Daniel wrapped up his talk by talking about the following business scenarios –

  1. Azure Web/Mobile App to On-Prem
  2. IaaS Server (VM) to On-Prem
  3. SaaS Service to On-Prem
  4. Business to Business
  5. Service Fabric Cluster to On-Prem

To know more about these scenarios that Dan talked about, please watch the video which will be made available soon.

Session 8 – Unlocking Azure Hybrid Integration with BizTalk Server by Wagner Silveira

In this session, Wagner started off by explaining why BizTalk + Azure, and what BizTalk brings to hybrid integration –

  • On-premises adapters
  • Azure adapters
  • Separation of concerns
  • Availability
  • For existing users
    • Leverage investment into the platform
    • Continuity to developers

Wagner talked about the ways in which you can connect to Azure in detail along with some scenarios-

  • Service Bus
  • Azure WCF Relay
  • App Services/API Management
  • Logic Apps

Wagner showed an exciting demo for 2 Line of Business (LoB) systems and finally some tweets coming out of Logic Apps.

Session 9 – From Zero to App in 45 minutes (using PowerApps + Flow) by Martin Abbott

There we were – the last session at #Integrate2017. Not an easy slot for a speaker, closing out what had been an amazing 3 days of learning and experience. But Martin did a great job showing the power of PowerApps and Flow, and how you can build an application in 45 minutes using the combination.

Martin started off his talk talking about Business Application Platform Innovation which is represented in a very nice diagram.

Martin just had 3 slides and it was an action packed session with demo to create an application in under 45 minutes. We recommend you to watch the video which will be available shortly on the event website.

Key Announcement – Global Integration Bootcamp 2018

Martin was one of the organizers of the recently concluded Global Integration Bootcamp event in March 2017. It’s now official that we will have the #GIB event in 2018. The event will happen on 24th March, 2018. You can follow the website http://www.globalintegrationbootcamp.com/ for further updates.

Sentiment Analysis on #Integrate2017

In the Day 1 Recap blog, we had shown some statistics on the sentiment analysis of tweets for hashtag #Integrate2017. Here is one last look at the report at 00:00 (GMT+0530) on June 29, 2017.

And, with that, it was curtains down on what had been a fantastic 3 days at INTEGRATE 2017. Well, we are not quite done yet! As announced on Day 1 by Saravana Kumar, INTEGRATE 2017 will be back in Redmond (near Seattle), USA on October 25-27, 2017. So if you missed attending this event in London, come and join us in Redmond.

We hope you had a great time at INTEGRATE 2017. Until next time, adios!!!


Author: Sriram Hariharan

Sriram Hariharan is the Senior Technical and Content Writer at BizTalk360. He has over 9 years of experience working as documentation specialist for different products and domains. Writing is his passion and he believes in the following quote – “As wings are for an aircraft, a technical document is for a product — be it a product document, user guide, or release notes”. View all posts by Sriram Hariharan

INTEGRATE 2017 – Recap of Day 2


After an exciting Day 1 at INTEGRATE 2017 with loads of valuable content from the Microsoft Pro Integration team, it was time to get started with Day 2 at INTEGRATE 2017.

Important Links – Recap of Day 1 at INTEGRATE 2017,
Photos from Day 1 at INTEGRATE 2017

Session 1 – Microsoft IT journey with Azure Logic Apps by MSCIT team

Day 2 at INTEGRATE 2017 started off with Duncan Barker of BizTalk360 introducing Mayank Sharma and Divya Swarnkar from the Microsoft IT Team. The key highlights from the session were –

    • The integration landscape at Microsoft spans 1000+ partners, 170M+ messages per month, 175+ BizTalk Servers, 200+ line-of-business systems, 1300+ transforms, and a multi-platform setup supporting BizTalk Server 2016, Azure Logic Apps, and MABS
    • Microsoft IT Team showed why the team were motivated to move to Logic Apps –
      • Modernization of Integration (Serverless Computing + Managed Services, business agility and accelerated development)
      • Manage and Control Costs based on usage
      • Business Continuity
    • The following image shows where the MSCIT team is placed today in terms of number of releases. Microsoft Azure BizTalk Services will be retired by end of July.
    • Microsoft IT team uses Logic App pipeline to process EDI messages coming from partners
    • For testing, the Microsoft IT team uses Azure API Management policies to route message flows to parallel pipelines
    • The team at Microsoft IT uses Operations Management Suite (OMS) for Logic Apps diagnostics. This was briefly covered earlier by Srinivasa Mahendrakar in one of the Integration Monday sessions – Business Activity tracking and monitoring in Logic Apps. Microsoft IT have migrated all their EDI workloads off of MABS and BizTalk and onto Logic Apps.
    • Microsoft IT only uses BizTalk for its adapters to connect to LOB systems, while all processing happens in Logic Apps.
    • Finally, the team shared their learnings while working with Logic Apps
      • Each Logic App has a published limit – make sure you understand what they are
      • Consider the nature of flow you will create with Logic Apps – high throughput or long running workflows
      • Leverage the platform for concurrency (SplitOn vs. ForEach)
      • Understand the structure and behavior of data (batched vs. non-batched)
      • Consider a SxS strategy to enable test in production
      • In Logic Apps, your delivery options are ‘at least once’ or ‘at most once’ (not ‘exactly once’)
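To illustrate the SplitOn point above: a hypothetical trigger definition (the trigger name, URI and payload shape are invented for this sketch) can debatch an incoming array so that each item starts its own workflow run, instead of looping over the items with ForEach inside a single run:

```json
"triggers": {
  "When_messages_arrive": {
    "type": "Http",
    "inputs": {
      "method": "GET",
      "uri": "https://example.org/messages"
    },
    "recurrence": { "frequency": "Minute", "interval": 5 },
    "splitOn": "@triggerBody()?['messages']"
  }
}
```

With splitOn, the platform handles the fan-out and per-item concurrency for you, which is what the “leverage the platform for concurrency” guidance refers to.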

Jim Harrer was really appreciative and thankful to the Microsoft IT team for making their trip to London to share their experiences.

Session 2 – Azure Logic Apps – Advanced integration patterns

This was one of the most anticipated sessions on Day 2 at INTEGRATE 2017, with Jeff Hollan (Sir Hollywood) and Derek Li talking about “Advanced integration patterns”. The agenda of the session included –

  • Logic Apps Architecture
  • Parallel Actions
  • Exception Handling
  • Other “Operation Options”
  • Workflow Expressions

The Logic Apps architecture under the hood looks as follows –

An important point to observe is that the ForEach loop in Logic Apps runs its iterations in parallel by default!

Awesome overview from @jeffhollan @logicappsio on how #LogicApps are executed by the runtime. No thread management needed!!

The Logic Apps designer is basically a TypeScript/React app that uses OpenAPI (Swagger) to render inputs and outputs. The designer can generate the workflow definition (JSON), and you can configure the runAfter options directly from the designer.
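As a sketch of the runAfter concept (the action names and URLs are made up for illustration), a second action can be configured to run only when a previous action fails or times out – this is how exception handling is expressed in the workflow definition:

```json
"actions": {
  "Call_backend": {
    "type": "Http",
    "inputs": { "method": "GET", "uri": "https://example.org/api/orders" },
    "runAfter": {}
  },
  "Notify_on_failure": {
    "type": "Http",
    "inputs": { "method": "POST", "uri": "https://example.org/api/alerts" },
    "runAfter": { "Call_backend": [ "Failed", "TimedOut" ] }
  }
}
```

The default runAfter behavior is “Succeeded”, so listing other statuses is what turns an ordinary action into an exception handler.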

This statement from Jeff Hollan was probably the highlight of the show –

In the history of #LogicApps, there hasn’t been a single run that hasn’t executed at least once.

After a very interesting demo by Derek Li, Jeff Hollan started his talk on workflow expressions. An expression is any input that is dynamic (i.e., changes on every run). Jeff explained the different expression constructs in an easy-to-understand way –

@ – Used to indicate an expression; it can be escaped with @@. Example – @foo()

() – Encapsulates the expression parameters. Example – @foo('Hello World')

{} – “Curly braces means string!!!” String interpolation, the same as @string(). Example – @{add(1,1)} (returns the string “2”)

[] – Used to access properties in JSON objects. Example – @foo('JsonBody')['person']['address']
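Putting the four constructs together, a hypothetical action input object (all the property and action names here are invented) might look like this:

```json
{
  "inputs": {
    "expression": "@concat('Hello ', triggerBody()?['name'])",
    "interpolatedString": "1 + 1 = @{add(1, 1)}",
    "escapedLiteral": "@@thisIsJustText",
    "propertyAccess": "@body('GetPerson')['address']['city']"
  }
}
```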

This session from Jeff Hollan and Derek Li was well received by the audience at #Integrate2017.

Jeff also mentioned that a feature allowing customers to test expressions directly in the designer is coming soon!

Session 3 – Enterprise Integration with Logic Apps by Jon Fancey

In this session, Jon Fancey started off his presentation by talking about Batching in Logic Apps and how it works –

  • There are basically two Logic Apps – Sender and Receiver
  • The batcher (sender) is aware of the batching (receiver) Logic App, whereas the batching Logic App is not aware of its batchers (1:n)

What’s coming in Batching?

  1. Batch Flush
  2. Time based Batch release trigger options
  3. EDI Batching
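For context, a batch receiver Logic App declares its release criteria in a Batch trigger. The sketch below (the configuration name and thresholds are invented, and the exact schema may have evolved since the event) shows a batch that releases on either 10 messages or every 10 minutes:

```json
"triggers": {
  "Batch_messages": {
    "type": "Batch",
    "inputs": {
      "mode": "Inline",
      "configurations": {
        "OrderBatch": {
          "releaseCriteria": {
            "messageCount": 10,
            "recurrence": { "frequency": "Minute", "interval": 10 }
          }
        }
      }
    }
  }
}
```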

Jon Fancey then moved on to the concept of the Integration Account (IA) and mentioned that the VETER pipeline is available as a template in Azure Logic Apps using an Integration Account.

  • Integration Account is the core to XML and B2B capabilities
  • IA provides partner creation and management
  • IA provides XML validation, mapping and flat file conversion
  • Provides tracking

Jon listed the Logic Apps enhancements coming soon for working with XML such as:

  • XML parameters
  • Code and functoids
  • Enhancements soon
    • Transform output format (XML, HTML, Text)
    • BOM handling

Jon showed a very interesting demo on transforming an XML message with C# and XSLT in Logic Apps. You will have to wait a little longer until the videos are made available on the INTEGRATE 2017 event website 🙂

Disaster Recovery with B2B, and how it works?

In the final section of his presentation, Jon discussed the monitoring and tracking of Azure Logic Apps. This topic was covered by Srinivasa Mahendrakar in one of his recent Integration Monday sessions.

Jon showed an early preview (mockup) of the OMS Dashboard for Azure Logic Apps that’s coming up soon. With this, you can perform Operational Monitoring for Logic Apps in OMS with a powerful query engine. You can expect this feature to be rolled out mid-July!

With that, the first set of morning sessions on Day 2 at INTEGRATE 2017 was complete.

Session 4 – Bringing Logic Apps into DevOps with Visual Studio and monitoring by Jeff Hollan/Kevin Lam

Once again, but unfortunately for the last time on stage, it was time for Sir Hollywood Jeff Hollan to rock the stage with his partner Kevin Lam to talk about bringing Logic Apps into DevOps with Visual Studio and monitoring.

The key highlights from the session include –

Visual Studio tooling to manage Logic Apps

  • Hosted the Logic App Designer within Visual Studio
  • Resource Group Project (same project that manages the ARM projects)
  • Cloud Explorer integration
  • XML/B2B artifacts

Make sure you have installed the “Cloud Explorer for Visual Studio” and “Azure Logic Apps Tools for Visual Studio” extensions in order to use Logic Apps from Visual Studio. The tooling works with both Visual Studio 2015 and 2017.

Kevin and Jeff demoed the Visual Studio tooling with a real-world example of working with Logic Apps in Visual Studio.

Azure Resource Templates

  • You can create Azure Resource Templates that are deployed through Azure Resource Manager.
    • Azure resources can be represented and created via the programmatic APIs available at http://resources.azure.com. This is a pivot into Azure where you are looking at the API view of your resources.
  • Resource templates define a collection of resources to be created
  • Templates include –
    • Resources that you want to create
    • Parameters that you want to pass in during deployment
    • Variables (specific calculated values)
    • Outputs
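The template parts listed above map onto a minimal ARM template skeleton for a Logic App, along these lines (the workflow definition itself is elided and the parameter name is a placeholder):

```json
{
  "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentTemplate.json#",
  "contentVersion": "1.0.0.0",
  "parameters": {
    "logicAppName": { "type": "string" }
  },
  "variables": {},
  "resources": [
    {
      "type": "Microsoft.Logic/workflows",
      "apiVersion": "2016-06-01",
      "name": "[parameters('logicAppName')]",
      "location": "[resourceGroup().location]",
      "properties": {
        "definition": { "triggers": {}, "actions": {} }
      }
    }
  ],
  "outputs": {}
}
```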

Service Principal

A service principal gives an application you create its own identity, which you can then authorize to access specific resources.

Jeff wrapped up the session by showing a demo of how the deployment process works, in detail. You can watch the video that will be available in a week’s time on BizTalk360 website for the detailed understanding of the steps to perform a deployment.

This wrapped up the 1.5 days of sessions from Microsoft on the core integration technologies and what’s coming from them in the next few months. It was now time for the Integration MVPs to take the stage and show what they have achieved, and what can be done, with the various offerings from Microsoft.

Session 5 – What’s there & what’s coming in BizTalk360 & ServiceBus360 by Saravana Kumar

Saravana was given a “warm” welcome with a nice music and a loud applause from the audience! 🙂 Saravana thanked the entire Microsoft team for their presence and effort at INTEGRATE 2017 over the last 1.5 days.

Key Highlights from Saravana’s session

BizTalk360 Updates

  • BizTalk Server License Calculator
  • Folder Location Monitoring
    • File, FTP/FTPS, SFTP
  • Queue Monitoring
  • Email Templates
  • Throttling Monitoring
  • On-Premise + Cloud features
    • Azure Logic Apps Management
    • Azure Logic Apps Monitoring
    • Azure Integration Account
    • Azure Service Bus Queues (monitoring)

You can get started with a 14-day FREE TRIAL of BizTalk360 to realize the full blown capabilities of the product.

ServiceBus360

Saravana discussed the challenges with Azure Service Bus and how ServiceBus360 helps to solve the Operations, Monitoring and Analytics issues of Azure Service Bus.

ServiceBus360 pricing starts as low as $15 – we wanted to go with a low-cost, high-volume model. You can also try the product for FREE if you are keen. If you are an INTEGRATE 2017 attendee, we have a special offer for you that you cannot afford to miss.

With that it was time for the attendees to break for lunch on Day 2 at INTEGRATE 2017. Lots more in store over the remaining 1.5 days!

Post Lunch Sessions – Session 6 – Give your Bots connectivity, with Azure Logic Apps by Kent Weare

We’ll take you through a quick recap of the post lunch sessions on Day 2 at INTEGRATE 2017.

Kent Weare started off by talking about his company and how it is coping with the business-transformation demands from government and local bodies in Canada. Kent then showed how the company has grown over the years and what business transformation will mean to them in terms of cost. The approach they have taken is to move towards automating insight, artificial intelligence, machine learning, and bots.

Kent then showed why bots are gaining popularity these days – to improve productivity! Bots are very similar to the instant messengers users are already familiar with.

Kent then stepped into his demo where the concept was as follows –

Kent wrapped up his session with the following summary for companies to take advantage of the latest technology in store these days.

Session 7 – Empowering the business using Logic Apps by Steef-Jan Wiggers

After Kent Weare, Steef-Jan Wiggers took over the stage to talk about Empowering the business using Logic Apps. This talk from Steef-Jan Wiggers was more from the end user/consumer perspective of using Logic Apps.

Steef-Jan took the business case of a company called “Cloud First” that wanted to move to the cloud (and chose Azure). His talk focused on this company, which wanted to migrate to the cloud with minimal customization and a unified landscape. Steef-Jan also showed some sentiment around the developer experience with Logic Apps.

Steef showed a demo that calculates the sentiment of #Integrate2017 (which is exactly something similar folks at BizTalk360 also have tried and reproduced in the Day 1 Recap blog).

After the end of the demo, Steef talked about the Business Value of Logic Apps –

  • Solving business problem first
  • Fit for purpose for cloud integration
  • Less cost; Faster time to market

Session 8 – Logic App continuous integration and deployment with Visual Studio Team Services

After Steef-Jan, Johan Hedberg took the stage to talk about Logic App continuous integration and deployment with Visual Studio Team Services. Johan set the stage for the session with an example –

  • Pete is a web developer who loves the Azure Portal and has an amazing time to market. Generally, he is fast but has no process.
  • Charlotte loves Visual Studio. She wants to bring the Logic App from Visual studio with Source control.
  • Bruce is an operations guy. He does not like Pete and Charlotte having direct access to production. He likes to have a process over anything and would want to approve things before it goes out.

What all three of them are missing is a common process/pipeline, which shows up as a lack of –

  • Development standards
  • Process standards
  • Security standards
  • Deployment standards
  • Team communication and culture, and more

In this session (and demo), Johan showed how teams can set up continuous integration and deployment for Logic Apps with Visual Studio Team Services.

Sessions 9 & 10 – Internet of Things

In the last two sessions of Day 2 at INTEGRATE 2017, Sam Vanhoutte and Mikael Hakansson talked about Integration of Things (IoT).

Sam Vanhoutte talked about why integration people are well placed to build good IoT solutions. He showed the IoT end-to-end value chain with a nice diagrammatic representation.

Then Sam talked about the different points in the Industrial IoT Connectivity challenge. The points are –

  • Direct connectivity (feels less secure)
  • Cloud gateways (easier to start in a cloud setup)
  • Field gateways (feels more secure)

Sam spoke about Azure IoT Edge, the required hardware for Azure IoT Edge and more about flexible business rules for IoT solutions.

Mikael Hakansson started his IoT talk where Sam Vanhoutte left off, but then came the fun part of the session: Sandro Pereira stopped Mikael from delivering his presentation and made him wear a green shirt for losing a bet on a football match (well, it is not clear Mikael was ever part of that bet, but his friends unanimously agreed he lost it 🙂). Steef-Jan Wiggers had apparently lost too, as he was wearing a green shirt as well!

Mikael started off his talk about IoT === Integration and he introduced the concept of Microsoft Azure IoT Hub in detail.

  • Stand-alone service or as one of the services used in the new Azure IoT Suite
  • With Azure IoT Hub, you can connect your devices to Azure:
    • Millions of simultaneously connected devices
    • Per-device authentication
    • High throughput data ingestion
    • Variety of communication patterns
    • Reliable command and control

Mikael gave a very cool demo on IoT with Azure Functions in his usual calm way of coding on stage. We recommend watching the video to appreciate the effort that went into preparing the demo, and his ability to code while presenting.

End of the Sessions

At the end of the sessions, it was curtains down on what proved to be another spectacular day at INTEGRATE 2017. The team gathered for a lovely photo shoot, courtesy of photographer Tariq Sheikh.

With that we would like to wrap our exhaustive coverage of Day 2 proceedings at INTEGRATE 2017. Stay tuned for the updates from Day 3. Until then Good night from London!

ICYMI: Recap of Day 1 at INTEGRATE 2017

Author: Sriram Hariharan


Techorama 2017 Keynote – Recap


Techorama is a yearly international technology conference which takes place at Metropolis, Antwerp. With 1500+ attendees from across the globe, the stage was set to showcase the intelligence of Azure. As one of the thousands of virtual participants, I am happy to document the keynote on developing with the cloud, presented by Scott Guthrie, Executive Vice President of the Cloud and Enterprise Group at Microsoft. The most interesting aspect of the keynote was that Scott structured the whole demo around scenarios, from the perspective of an everyday developer. Let me take you through the keynote quickly.

Azure Mobile app

The inception of the cloud inside a mobile! Yes, you heard it right: Microsoft has come up with an Azure app for iOS/Android/Windows to manage all your cloud services. You can now easily manage your cloud resources from your phone.

Integrated Bash Shell Client

The Bash shell is now integrated into the Azure portal, letting you manage and query Azure services with a single command. The shell opens in a browser pop-up and connects to the cloud without any keys. Automation scripts can be executed easily with this Bash integration in place, and it also provides CLI documentation for the list of commands/arguments. You can expect a PowerShell client soon!

Application Map

The flow between the different cloud services, together with their status, diagnostic logs and charts, is displayed at the dashboard level. Using a top-down approach, you can drill down to per-instance tracking based on failure/success/slow-response scenarios, complete with diagnostics, stack traces, and the ability to create a work item from a failure stack trace. From the admin/operations perspective, this feature is a great value-add.

Stack trace with Work item creation

Security Center

Managing the security of a cloud system can be a complex task. With Security Center in place, we can easily manage all the VMs and other cloud services. Machine learning algorithms at the backend surface all the relevant recommendations for an environment or its services.

Recommendations

Possible recommendations for virtual machines are provided with the help of machine learning algorithms.

Essentials for Mobile success

To deliver a seamless mobile experience, you need an interactive, user-friendly UI, BTD (build, test, deploy automation), and scalability through cloud infrastructure. These are the essentials for mobile success, and Microsoft, with the Xamarin platform, has nailed it.

A favorite area of mine has gained a much-needed intelligent feature: the Xamarin + VS2017 combo is now making its way into real-time debugging!!!

You can pair your iPhone (or any mobile device) with Visual Studio via the Xamarin Live Player, which allows you to perform live debugging. DevOps support for Xamarin has also been extended: you can now build-test-deploy to any device connected to the cloud, just like a continuous integration build. Automation of testing and deployment for the mobile framework is the best part, and you can get real-time memory usage statistics for your application in a single window. Also, iOS development is now supported from VS2017 as well. 🙂

The mobile features do not stop there. Visual Studio Mobile Center is also integrated, letting you run a staging test with a community of friends to get feedback on your mobile application before you submit it to any mobile store. Cool, isn’t it?

SQL Server 2017

Scott also revealed some features of the upcoming SQL Server 2017, which is capable of running on Linux and in Docker, apart from Windows.

The new SQL Server 2017 gets adaptive query processing and advanced machine learning features, and can offer in-memory support for advanced analytics. SQL Server is also capable of seamless failover between on-premises and cloud SQL with no downtime, along with the Azure Database Migration Service.

Azure Database- SQL Injection Alerts

SQL injection is one of the most common attacks an application faces. As a remedy, Azure SQL Database can now detect SQL injection using machine learning algorithms, sending you an alert when an abnormal query is executed.

Showing the vulnerability in the query

New Relational Database service

The relational database service is now extended with PostgreSQL as a service and MySQL as a service, which can seamlessly integrate with your application.

Data at planet scale: Cosmos DB

That could be the right phrase to describe Cosmos DB. Azure now has a globally distributed, multi-model database service for high scalability and geographical access. You can easily replicate/mirror/clone the database to any geographical location based on your user base. To give you an example, you can scale from gigabytes to petabytes of data, and from hundreds to millions of transactions, with all metrics in place. And that makes the name COSMOS!

Scott also showed us a video on how the online retailer Jet.com is using Cosmos DB, along with a chat bot that runs on Cosmos DB to answer human queries intelligently. With Cosmos DB and the Gremlin API you can retrieve comprehensive graph analyses of your data. Here he showed us the Marvel comics characters and the friends graph of Mr. Stark – quite cool!

Converting existing apps to a container-based microservice architecture

You may wonder how to move your existing application to Azure’s container-based architecture, and here is the solution, with the support of Docker. In your existing application project you can easily add Docker support, which runs your application on an ASP.NET image, from where it can easily join the cloud build-deploy-test continuous integration framework. The simple addition of a Docker metadata file has made DevOps much easier.
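As a rough sketch (the image tag and assembly name are assumptions, in line with what Visual Studio generated for ASP.NET Core apps around that time), the Docker metadata file is just a small Dockerfile like this:

```dockerfile
# Run the published app on the official ASP.NET Core runtime image
FROM microsoft/aspnetcore:1.1
WORKDIR /app
# Copy the publish output produced by the build pipeline
COPY ./publish .
EXPOSE 80
ENTRYPOINT ["dotnet", "MyWebApp.dll"]
```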

Azure stack

Plenty of case studies show the love for Azure functionality, but some enterprises have not been able to use it for tailor-made solutions. Enter Azure Stack: a private-cloud hosting capability for your own data center, letting you use all the cloud expertise on your own ground.

Conclusion

As more features, including Azure Functions, Service Fabric, etc., are introduced, this gist of the keynote should have given you an overall view of the Intelligent Cloud, with much more to come: tune in to the Techorama Channel 9 feed for more updates from the Day 2 events. With the cloud scaling out with new capabilities, there will hardly be an application in the future that does not rely on cloud services.

Happy Cloud Engineering!!!

Author: Vignesh Sukumar

Vignesh, a Senior BizTalk Developer @BizTalk360, has crossed half a decade of BizTalk experience. He is passionate about evolving integration technologies. Vignesh has worked on several BizTalk projects covering various integration patterns and has expertise in BAM. His hobbies include training, mentoring and travelling. View all posts by Vignesh Sukumar