Microsoft Integration Weekly Update: Nov 20, 2017

Do you find it difficult to keep up to date with all the frequent updates and announcements on the Microsoft Integration platform?

The Integration weekly update can be your solution. It’s a weekly update on topics related to Integration – enterprise integration, robust & scalable messaging capabilities, and Citizen Integration capabilities – empowered by the Microsoft platform to deliver value to the business.

If you want to receive these updates weekly, then don’t forget to Subscribe!

On-Premises Integration:

Cloud and Hybrid Integration:

Back to Cloud Services via The Azure podcast

Feedback

I hope this is helpful. Please feel free to let me know your feedback on the Integration weekly series.


Does BizTalk 2013R2 support IBM WebSphere MQ 9?

My current customer has an integration landscape with a lot of IBM WebSphere MQ. After a certain queue manager was upgraded to IBM WebSphere MQ 9, we were no longer able to receive messages from a queue that had worked perfectly before the upgrade. Sending still worked as before.

At that moment in time we were running BizTalk 2013R2 CU7 with Host Integration Server (HIS) 2013 CU4 and using the IBM MQ Client 7.5.0.8.

Our event log was full of errors like these:

This setup was still working perfectly with IBM WebSphere MQ 7 and 8 queue managers. I also tried to update the MQ client to a higher version (8.0.0.7), but this resulted in even more errors…
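If you want to double-check which MQ client version is actually installed on the BizTalk server, IBM's dspmqver utility reports the version and build level (run it from the MQ client's bin folder if that folder is not on the PATH):

dspmqver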

The Solution: Host Integration Server (HIS) 2016

When you take a look at the System Requirements of HIS 2016, you see that it supports MQ 8. No mention of MQ 9, I know… But it also supports BizTalk Server 2013R2! At this point we really needed a solution, so we took it for a spin!

I installed and configured everything in the following order (the installation order is always very important!):

  1. Install BizTalk 2013 R2
  2. Install BizTalk Adapter Pack
  3. Configure BizTalk
  4. Install BizTalk 2013 R2 CU7
  5. Install .NET 4.6.2 (required for HIS 2016)
  6. Install HIS 2016 (no configuration)
  7. Install IBM MQ Client 8.0.0.7 (64 Bit)
  8. Add MQSC Adapter to BizTalk
  9. Install HIS 2016 CU1
  10. Reboot Servers

For HIS 2016 I used the following minimal installation (as we only require the MQSC Adapter):

After all of this I was able to successfully do the following:

  • Send and receive messages from IBM WebSphere MQ 9 queues with “Transactional Supported” by using a 64-bit Host Instance

Conclusion

In the end we reached our goal and were able to send and receive messages from an IBM WebSphere MQ 9 queue with BizTalk Server 2013R2.

Some people may ask why I didn’t use the Microsoft MQ Client. Well, it didn’t work straight away, and we agreed not to research this further as we had already started our migration project to BizTalk Server 2016.

Self-troubleshooting tools in BizTalk360

We, the product support team, often receive different types of support cases from our customers. Some of them may be functional, others may be related to installation, and so on. Every support case is a new learning experience, and we put in our best efforts to resolve the issues, thereby providing a better experience to the customers. As the below quote says:

“Customer success is simply ensuring that our customers achieve their desired outcome through their interactions with our company” – Lincoln Murphy

We must make sure that we take customers in the right direction when they raise an issue, and we must give them confidence in the product as well as the service.

As you may already know, BizTalk360 is the one-stop monitoring tool for BizTalk Server. It not only contains monitoring options for BizTalk Server, but also other built-in tools such as BAM, BRE, etc. Whenever an issue is reported by a customer, we start our investigation with the basic troubleshooting steps. It may be interesting to know what this basic troubleshooting looks like, so in this blog I will share information about the basic troubleshooting tools in BizTalk360 that help us resolve customer issues.

Installer Log Information:

The first step in using BizTalk360 is the installation and configuration. The installation, as well as the upgrade process, is seamless, with simple steps, and some of the permission checks are done by the installer. But there may be cases where the installer fails with the below error.

[Screenshot: BizTalk360 installer error]

There is not much information about the error on the screen. So how do we investigate it? This is where the installer logs come to our help. Generally, when we install BizTalk360, we just give the name of the MSI in an admin command prompt and run it. But to enable installer logging, we need to run the installer with the below command (using the MSI for your BizTalk360 version):

msiexec /i "BizTalk360.Setup.Enterprise.msi" /l*v install.log
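The log location can also be given explicitly; for example (the path here is just an illustration):

msiexec /i "BizTalk360.Setup.Enterprise.msi" /l*v "C:\Temp\BizTalk360-install.log"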

If no path is given, the log will be created in the same folder where the MSI file is located. The steps performed during the installation are logged in the installer log, along with information about any exception thrown. So, for the above error, the logged information was:

MSI (s) (E0:D4) [13:57:50:776]: PROPERTY CHANGE: Modifying CONNECTION_ERROR property. Its current value is 'dummy'. Its new value: 'Cannot open database "BizTalk360" requested by the login. The login failed.

Login failed for user 'CORP\svcbiztalk360'.'.

The error clearly states that it is a permission issue. When BizTalk360 is installed, the BizTalk360 database gets created on the SQL Server. BizTalk360 may be installed on the same machine where BizTalk Server resides or on a standalone machine, and the SQL database may be on a separate server. As a prerequisite for BizTalk360, we recommend giving the service account the SYSADMIN role on the SQL Server. Granting this permission to the service account resolves the above error. Hence, any installation-related error can be identified from the installer logs and resolved.
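For reference, granting the role can be done with a T-SQL statement along these lines (the login name is taken from the error above; adjust it to your environment):

ALTER SERVER ROLE [sysadmin] ADD MEMBER [CORP\svcbiztalk360];

On SQL Server 2008 R2 and earlier, the equivalent is EXEC sp_addsrvrolemember 'CORP\svcbiztalk360', 'sysadmin'.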

Troubleshooter:

This is an interesting tool which is integrated within BizTalk360 and also available as a separate Windows-based tool. It contains an extensive set of rules to verify all the prerequisite conditions needed to successfully run BizTalk360. As you can see in the below picture, the user just enters the passwords for the IIS application pool identity and the monitoring service account and clicks the “Troubleshoot BizTalk360” button. The rules are then verified and the results indicated as RED/GREEN/ORANGE.

This way, we can find missing permissions for the BizTalk360 service account and provide them.

Apart from the permissions, the other checks done by the troubleshooter are:

  • IIS Check
  • SQL Select Permission check
  • Configuration File check
  • Database report

If the customer faces any issue during the initial launch of the application, they can run the troubleshooter and check the permissions. Once the errors are resolved and everything is green, they can start BizTalk360 and it should work.

[Screenshots: Troubleshooter results]

Hence, all the information regarding the service account permissions, the BizTalk360 configuration and the database can be obtained with the help of the troubleshooter. The integrated troubleshooter can be accessed from BizTalk360 itself, as seen below.

[Screenshot: accessing the integrated Troubleshooter from within BizTalk360]

BizTalk360 Service Logs:

The details of the exceptions that occur in BizTalk360 are captured in log files generated in the BizTalk360 folder. The log files not only contain information about exceptions; they also contain information about alarm processing and the subservice statuses. There are different logs for each of these, described below.

Monitoring Logs

Monitoring is one of the most important tasks performed by BizTalk360, and new features are added in every release. The monitoring capability has been extended to file locations, host throttling, BizTalk Server high availability and much more. The BizTalk360 monitoring service is installed along with the BizTalk360 web application. Once the artefacts are configured for monitoring, the service runs every 60 seconds and triggers alert emails according to the configured conditions.

What happens if an exception occurs during monitoring and alerts are not triggered? Where can we find information about these exceptions and take the necessary actions? This is where the service logs come in; they are located in the <BizTalk360 installation folder>\Service\Log folder. There are about 25 different logs, generated separately for each monitoring configuration and updated whenever the monitoring service runs. Say, for example, alerts are created but not transmitted due to an exception; this information will be logged in the BizTalk360.SendMail.log file. So, when a customer raises an issue regarding the transmission of alerts, the support team starts the investigation from the logs. We ask the customers to share the logs from their environment and we check them. Let’s look at a customer scenario.

One such issue was:

  • The customer had configured a receive port for process monitoring
  • But they were getting Actual count = -1 even when messages were processed via this port
  • They wanted to know why the actual count was -1

In BizTalk360, a negative value denotes that some exception has occurred, and the exception will be logged in the service logs. Hence, we asked them to share the logs.

[Screenshot: BizTalk360 monitoring service log files]

From the BizTalk360.ProcessMonitor.log, we could see the following exception:

2017-09-21 10:16:45 ERROR – ProcessMonitoringHelper:GetMonitoringStatus. Alarm Name:PROD_DataMonitor: Name: Application : Atleast 20 Messages from DHL per hour: Exception:System.Data.SqlClient.SqlException (0x80131904): Timeout expired. The timeout period elapsed prior to completion of the operation or the server is not responding. —> System.ComponentModel.Win32Exception (0x80004005): The wait operation timed out at System.Data.SqlClient.SqlConnection.OnError(SqlException exception, Boolean breakConnection, Action`1 wrapCloseInAction)

The timeout exception generally happens when there is a huge volume of data in the BizTalkDTADb database, since for Process Monitoring we retrieve the results from this database and display them in BizTalk360. The database size was checked at their end and found to be 15 GB, which was greater than the expected size of 10 GB. For more information on the database size, you can refer here.
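As a quick first check, the database size can be inspected with a query like this (a sketch; run it against your own instance):

USE BizTalkDTADb;
EXEC sp_spaceused;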

Similarly, there are different log files from which we can get information about the different subservices running under the BizTalk360 Monitor and BizTalk360 Analytics services.

[Screenshots: BizTalk360 Monitor and Analytics service log files]

We can also check whether all the subservices started properly. The log information is captured along with a timestamp, which makes it much easier for the support team to identify the cause and resolve issues in time, thereby keeping the customer happy. In the case of the monitoring logs, the alarm name and configuration are also captured. There are separate logs for Process Monitoring and the other Data Monitoring alarms, and separate logs for FTP and SFTP monitoring too, for capturing any exceptions.

Analytics logs

Analytics is yet another important feature of BizTalk360, with the help of which you can visualize a lot of interesting facts about your BizTalk environment: the number of messages processed, the failure rate at message type level, BizTalk server CPU/memory performance, BizTalk process (host instances, SSO, rules engine, EDI, etc.) CPU/memory utilization, and lots more. The BizTalk360 Analytics service also runs different subservices, and any exception occurring in these services will be captured in the logs under the C:\Program Files (x86)\Kovai Ltd\BizTalk360\Analytics\Log folder.

[Screenshot: BizTalk360 Analytics log files]

BizTalk360 Analytics gathers information about the performance counters on the server and displays them in the form of widgets. BizTalk360 will also display, in a graphical format, whether the system is under a throttling condition. There was a case where a customer’s Throttling Analyser was not displaying any information when the system was under a throttling condition. We checked the logs and found the below error in BizTalk360.Throttling.log.

[Screenshot: error captured in BizTalk360.Throttling.log]

From the logs, we could see that the performance counters were corrupted; rebuilding the counters resolved the issue.

Web and Svc logs

At times, a page in BizTalk360 may take a while to load, leading to performance issues. The time taken to load a page is captured in the svc logs present in the C:\Program Files (x86)\Kovai Ltd\BizTalk360\Web folder.

[Screenshot: svc trace logs in the Web folder]

Once, a customer reported performance latency in some of the BizTalk360 pages. We checked these trace logs and found that the GetUserAccessPolicy and GetProfileInfo service calls were taking more than 30 seconds to complete.

GetUserAccessPolicy –> returns the groups/users assigned access to the features of BizTalk360.

GetUserProfile –> fetches the user profile of the configured group/user.

These methods were then optimized with caching in the next BizTalk360 release, which resolved the performance issue.

BizTalk360 subservices status

As mentioned before, different subservices run under the BizTalk360 Monitor and Analytics services. If there is any problem receiving alerts, or if a service is not running, the first step is to check the status of the monitoring subservices for any exceptions. This can be found under BizTalk360 Settings -> BizTalk360 Health -> Monitoring Service Status. The complete information is also captured in the logs.

[Screenshot: Monitoring Service Status under BizTalk360 Health]

Similarly, there is a check for the Analytics subservices under Settings -> Analytics Health.

What if the customer has configured BizTalk360 for High Availability (HA)? High Availability is the scenario where BizTalk360 is installed on more than one server, all pointing to the same database. The BizTalk360 Monitor and Analytics services can also be configured for HA. So, when an issue is reported with these services, the logs from the active server must be investigated. The active server can be identified from BizTalk360 Settings -> BizTalk360 Health -> High Availability Status.

[Screenshot: High Availability Status under BizTalk360 Health]

Conclusion:

These basic troubleshooting tools available in BizTalk360 make it a little easier for us to resolve customer issues; a first-step analysis with these tools helps us identify the root cause of a problem. We have our latest release, BizTalk360 v8.6, coming up in a few weeks with more exciting features. In case of further queries, you can write to us at support@biztalk360.com.

Author: Praveena Jayanarayanan

I am working as a Senior Support Engineer at BizTalk360. I always believe in teamwork leading to success because “We all cannot do everything or solve every issue. ‘It’s impossible’. However, if we each simply do our part, make our own contribution, regardless of how small we may think it is…. together it adds up and great things get accomplished.”

Microsoft Integration Weekly Update: Nov 13, 2017

Do you find it difficult to keep up to date with all the frequent updates and announcements on the Microsoft Integration platform?

The Integration weekly update can be your solution. It’s a weekly update on topics related to Integration – enterprise integration, robust & scalable messaging capabilities, and Citizen Integration capabilities – empowered by the Microsoft platform to deliver value to the business.

If you want to receive these updates weekly, then don’t forget to Subscribe!

Feedback

I hope this is helpful. Please feel free to let me know your feedback on the Integration weekly series.


BizTalk Server: Creation of Adapter WCF-SQL Configuration Store entries failed. Access denied.

This week, while configuring and optimizing a brand-new BizTalk Server 2016 environment, we got the following error message while trying to register the WCF-SQL adapter in the BizTalk Server Administration console:

Creation of Adapter WCF-SQL Configuration Store entries failed. Access denied. See the event log (on computer ‘SQL-SERVER’) for more details.

[Photo: the error dialog in the BizTalk Server Administration console]

(sorry for the picture quality; it was taken with my cell phone)

Although I was a member of the BizTalk Administrators group, I didn’t have remote access to the SQL Server machine, which was managed by another team, so I couldn’t go there to check it out. Nevertheless, I reached out to that team (SQL and sysadmins) with a possible solution, and it turned out to be correct.

Cause

Many times, these types of issues lead us to believe that there are problems associated with MSDTC: it is not properly configured, Windows Firewall may be blocking DTC communications, or, in HA environments, SSO is not clustered and may be offline.

All these possibilities should be investigated. However, if any of the points mentioned above had been the cause in this particular case, the problem should have already manifested itself when the team installed the environment, and they installed it without encountering any problems.

The only difference between the installation then and my configuration now was that these tasks were performed by different users!

It is important to mention that the user who is trying to register an adapter using the BizTalk Server Administration console needs permissions on the SSO database in order to register the adapter’s properties, so that they can be stored and retrieved at design time and runtime.

And that is one of the reasons why the “BizTalk Server Administrators” group should be a member of the “SSO Administrators” group.

BizTalk administrators are responsible for configuring all the components of BizTalk, and many of these tasks need to interact with the SSO database.

The people/team responsible for installing BizTalk Server were members of the BizTalk Server Administrators and SSO Administrators groups, and some of them were system administrators; that is why they didn’t run into this or similar problems. The reason for the problem I faced was:

  • My user was a member of BizTalk Server Administrators and a local admin only, but the BizTalk Server Administrators group wasn’t a member of the SSO Administrators group.

Solution

To solve this problem, you have two options:

  • Add my user to the SSO Administrators group.
    • Not recommended, because in my opinion it is harder to manage user access rights if you add users to each individual group.
  • Or add the “BizTalk Server Administrators” group as a member of the “SSO Administrators” group.

After my user or the “BizTalk Server Administrators” group was added as a member of the “SSO Administrators” group, I was able to register the adapter.
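For illustration, if the “SSO Administrators” group is a local Windows group on the SSO server, the membership can be added with a command along these lines (the domain and group names are assumptions; use the ones from your environment):

net localgroup "SSO Administrators" "CORP\BizTalk Server Administrators" /add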

Note: this problem can happen with any adapter you are trying to register.

Author: Sandro Pereira

Sandro Pereira lives in Portugal and works as a consultant at DevScope. In the past years, he has been working on implementing Integration scenarios both on-premises and cloud for various clients, each with different scenarios from a technical point of view, size, and criticality, using Microsoft Azure, Microsoft BizTalk Server and different technologies like AS2, EDI, RosettaNet, SAP, TIBCO etc. He is a regular blogger, international speaker, and technical reviewer of several BizTalk books all focused on Integration. He is also the author of the book “BizTalk Mapping Patterns & Best Practices”. He has been awarded MVP since 2011 for his contributions to the integration community.

Learn How to Build Production Ready Azure Logic Apps in my new Pluralsight Course!

Interested in learning more about Azure Logic Apps?  What are they used for?  What business problems can they solve?

Take a look at my new course, “Azure Logic Apps: Getting Started”, available on Pluralsight, which offers training on working with and creating Azure Logic Apps.

It is a quick, 1-hour-18-minute overview of the basics of Logic Apps.  The content is broken down into 3 modules:

  • Introduction to Microsoft Azure Logic Apps
  • Design and Development of Logic Apps
  • Building a Production Ready Logic App

If you are short on time, you can watch Pluralsight content at up to 2x speed!

Give the course a try and I look forward to any feedback!

You can view the course here.

INTEGRATE 2017 USA – In Conversation

INTEGRATE is a global conference organized by BizTalk360 for people working in the Microsoft Integration space. It is held annually in London, and this year it also took place at the Microsoft Campus in Redmond, USA, between 25-27 October.

Here’s a short tête-à-tête between Duncan Barker and Bhavana Nambiar on their experiences at INTEGRATE 2017 USA.

Duncan – This is my first Integrate event in the US. I received a lot of good feedback and had some very interesting conversations. Did it meet your expectations, Bhavana?

Bhavana – Duncan, it was incredible! It was great that it all came together after a lot of hard work. I had certain expectations and it exceeded all of them. We reached out to a wide range of participants, from the Microsoft Product Group to Partners and from Consultants to End Users, and we were also able to touch so many Industries & Sectors, such as Healthcare, Utilities, Retail, Defence & Space, Paper Products, Forestry, Finance, Insurance, Oil & Gas, Food, and Wine & Spirits, to name a few. Talk about global reach… people traveled from 17 different countries to attend this event. Above all, the most satisfying part was organizing this event in what they call the ‘Technological Mecca’ – the Microsoft Headquarters in Redmond.

Bhavana – Duncan, you mentioned interesting conversations – what were they?

Duncan – To start with, I was so pleased to be able to introduce some of our attendees to the Microsoft Product Group. It isn’t very often they get the chance to speak to the very people who are shaping the future of Integration. Furthermore, such conversations are invaluable for the Microsoft team, who get to speak directly, in person, to those using their software.
The other conversations I refer to are those between Consultants and end users. In face-to-face meetings, Consultants could respond to questions about the challenges end users face in their day-to-day operations.

Integrate 2017 USA was a great opportunity for me to raise some of the challenges and questions we have been facing when creating Azure solutions for our clients. I was able to have very beneficial discussions with the Microsoft Product Group on these items, so besides all the great knowledge I take home with me from the discussions and presentations, I also bring home new relationships that will help my team and me in the future.

And importantly for BizTalk360, I met and spoke to some of our users and to those who are evaluating our products for use in forthcoming projects.
I saw a lot of these conversations happening outside the conference room and at the informal dinner on Monday evening. I can only imagine this exceeded your expectations. Did you think it would work as well as this?

Bhavana – When we were planning this event, we wanted to give the attendees plenty of opportunities to network with people from the same community. That was also the idea behind the Evening Dinner, which we arranged in the Microsoft Commons. I was quite pleased with the turnout and loved the space it gave the attendees to mingle with the Product Group and each other.

Duncan, would you believe it if I said we were also instrumental in some of the reunions that happened during this event? Two of the attendees met each other for the first time in 10 years during Integrate 2017 USA. And Saravana mentioned a few people he met during the event who were colleagues from his previous job 9 years ago, before BizTalk360 was born.
I was also overwhelmed to see how people of different nationalities came together, and they all spoke one language – ‘Microsoft Technologies’.

Bhavana – I think we always need to make things better for the next event, so tell me about the challenges you faced while networking.

Duncan – I wanted to meet as many people as possible and to engage in quality conversations about their roles in the Integration space. In many cases, I could finally put a face to an email contact and have a personal discussion. Everyone was very friendly and wanted to talk about their BizTalk experiences; being quite new to the industry, it gave me great insight into how companies manage their integrations. But with my poor eyesight, one challenge was trying to read the name badges without staring at people’s waistlines! Next time we should make the names and company names bolder and reduce the size of the lanyard so it hangs a bit higher!

Bhavana – Duncan, I remember your comment after Integrate 2017 in London, and I did take it on board and ordered a different lanyard this time around, but it is unbelievable how people with IT skills could not work out how to use a lanyard! We will need to do more brainstorming and research on this topic.

Bhavana – What did you think of the Event Venue?

Duncan – At Integrate 2017 in London, some felt trapped in the auditorium, without the ability to come and go freely from time to time. In my view, the Microsoft Campus facility gave the best of both worlds – a freer space to listen to the addresses and move around without disturbing the speakers – and the audio-visual setup worked very well, with the 3 huge screens, back and front. However, some of the speakers missed the theatrical spotlight of the London venue and requested the rock ’n’ roll intros of London!

Duncan – As the organizer, how did you manage to co-ordinate the speakers and the content of their speeches?

Bhavana – I think all the credit goes to Saravana for liaising with the Product Team and the MVPs to bring the right content to our attendees. One of the main reasons for the event’s success is the quality of the content presented in the sessions, and I think we were spot on. It was a good mix, with sessions focusing on all the main technologies, such as BizTalk, Logic Apps, API Management, Messaging, Microsoft Flow, etc.

Bhavana – Being a Business Development person trying to engage with more and more Partners & Consultants, how did this event help you?

Duncan – Two Partners, VNB Consulting and DevScope, sponsored the event, and several more traveled to it, some travelling halfway round the world. This commitment to attend proves its benefit to them. The success of Integrate 2017 USA has prompted some Partners present in Redmond to express an interest in sponsoring the next event. So, for me it was a great opportunity to listen to what our Partners require to meet the needs of their clients, and to hear more about what they want out of the Integrate events and out of BizTalk360 as an ISV. I hope our announcements about product improvements to BizTalk360 and ServiceBus360, plus the unveiling of our new product, Atomic Scope, have demonstrated that we are a company to work closely with.

Duncan – We haven’t mentioned the superstar of the conference. What was the reaction to Scott Guthrie’s keynote?

Bhavana – It was a great privilege to have Scott Guthrie do the keynote. It added star quality to the event and the attendees were really excited by his presence. Personally speaking, I was awestruck, and I remember how we cautiously approached Scott’s assistant in the hope of having a photograph taken with him – and ta-da, here is the result!

[Photo: the Integrate 2017 USA organizers with Scott Guthrie]

I must say it did start a trend, as everyone jumped up to take their selfies with Scott. A big thanks to Jim for his efforts in getting Scott on board; without him this wouldn’t have been possible.

Bhavana – So if I can ask you, what did you take away from Integrate 2017 USA?

Duncan – As in London in June, my first impression was one of community. Despite the different roles of the attendees, we all have one thing in common – to move forward with, and get the best out of, Microsoft Technologies. The collaboration amongst everyone was first class. Secondly, it was a super opportunity to meet people in key decision-making positions; meeting influential people who are shaping the way their companies are run is exactly what I want.

Finally, I would like to thank everyone I got to know at Integrate. I will speak to and meet many of the attendees again, perhaps at the next Integrate event!

I have been asked by many at the event what the final statistics looked like – do you have the final numbers?

Bhavana – During the 3-day event there were 25 speakers presenting 24 sessions to 222 attendees from 17 countries.

All in all, a very successful event. Thanks to you, Saravana, Gowri, Sriram and Parthiban, who all worked behind the scenes to make this event a success. A big shout out to all the attendees, the Product Group, MVPs, sponsors and the community for backing us all the way.

Our hunt for a new event venue for Integrate 2018 in London starts now…

Attendee Testimonials

Integrate is the must attend event if you are involved in the Microsoft Integration space. There is no better place to learn and communicate with your peers and the Microsoft Product Groups.

A very informative event with excellent speakers and relevant content. Highly recommended.

Thank you for a great conference. Very informative and good speakers, and very interesting and rightly balanced sessions.

Eye opening event, extremely useful.

This event gave me the answers to the direction I should be directing my staff to stay current on technology.

The presentations were really helpful at enabling me to identify a well-reasoned and justifiable path forward. I was feeling quite frustrated before the conference.

Author: Bhavana Nambiar

Bhavana makes sure our customers and team are well taken care of: license keys, payroll, benefits, taxes, accounting, dealing with the bank – she does it all!

Microsoft Integration Weekly Update: Nov 6, 2017

Do you find it difficult to keep up to date with all the frequent updates and announcements on the Microsoft Integration platform?

The Integration weekly update can be your solution. It’s a weekly update on topics related to Integration – enterprise integration, robust & scalable messaging capabilities, and Citizen Integration capabilities – empowered by the Microsoft platform to deliver value to the business.

If you want to receive these updates weekly, then don’t forget to Subscribe!

Cloud Shell via The Azure podcast

Feedback

I hope this is helpful. Please feel free to let me know your feedback on the Integration weekly series.


Stef’s Monthly Update – October 2017

The first month at Codit went faster than I expected. I traveled a lot this past month: a few times to Switzerland, where I work for a client; to London to run the Royal Parks half marathon; to Amsterdam the week after to run another; and finally to Seattle/Redmond for Integrate US.

Month October

October was an exciting month with numerous events. First of all, on the 9th of October I spoke at Codit’s Connect event in Utrecht on the various integration models. On that day I was joined by other great speakers like Tom, Richard, Glenn, Sam, Jon, and Clemens. This was the first full-day event by Codit on the latest developments in hybrid and cloud integration, and on integration concepts shared between the Internet of Things and Azure technology.

A new challenge I accepted this month was writing for InfoQ. Richard asked me if I wanted to write about cloud technology-related topics. So far, two articles are available:

It was not easy writing articles in a more journalistic style, which meant being objective, researching the news, and creating a solid story in 400 to 500 words.

Middleware Friday

Kent and I continued our Middleware Friday episodes in October. Cosmos DB, Microsoft’s globally distributed, multi-model database, offers integration capabilities through new bindings in Azure Functions.

The evolution of Logic Apps continues with the ability to build your own connectors.

Integrate US

On the 20th of October I flew across the Atlantic to Seattle to meet up with Tom and JoAnn. We did a nice micro-brewery tour the next day.

That Sunday we enjoyed seeing the Seahawks play the New York Giants. After the weekend, it was time to prepare for Integrate US 2017. You can read the following recaps from the BizTalk360 blog:

The recaps were written by Martin, Eldert and myself.

To conclude, Integrate US was a great success and was once again well organized by Team BizTalk360.

Before I went home I spent another weekend in Seattle to enjoy some more American football. On Saturday Kent and I went to see the Washington Huskies play UCLA.

On Sunday we watched Seattle play the Texans, a very close game. After the game, we recorded a Middleware Friday episode in our Seahawks outfits.

Music

My favorite albums in October were:

  • Trivium – The Sin And The Sentence
  • August Burns Red – Phantom Anthem
  • Enslaved – E

It was a busy month, and next month will be no different, with more traveling and my next speaking engagements: DynamicsHub and CloudBrew.

Cheers,

Steef-Jan

Author: Steef-Jan Wiggers

Steef-Jan Wiggers is all in on Microsoft Azure, Integration, and Data Science. He has over 15 years’ experience in a wide variety of scenarios such as custom .NET solution development, overseeing large enterprise integrations, building web services, managing projects, designing web services, experimenting with data, SQL Server database administration, and consulting. Steef-Jan loves challenges in the Microsoft playing field, combining them with his domain knowledge in energy, utility, banking, insurance, health care, agriculture, (local) government, bio-sciences, retail, travel, and logistics. He is very active in the community as a blogger, TechNet Wiki author, book author, and global public speaker. For these efforts, Microsoft has recognized him as a Microsoft MVP for the past 7 years.

Trying Out Microsoft’s Spring Boot Starters

Do you build Java apps? If so, there’s a good chance you’re using Spring Boot. Web apps, event-driven apps, data processing apps, you name it, Spring Boot has libraries that help. It’s pretty great. I’m biased; I work for the company that maintains Spring, and taught two Pluralsight courses about Spring Cloud. But there’s no denying the momentum:

If a platform matters, it works with Spring Boot. Add Microsoft Azure to the list. Microsoft and Pivotal engineers created some Spring Boot “starters” for key Azure services. Starters make it simple to add jars to your classpath. Then, Spring Boot handles the messy dependency management for you. And with built-in auto-configuration, objects get instantiated and configured automatically. Let’s see this in action. I built a simple Spring Boot app that uses these starters to interact with Azure Storage and Azure DocumentDB (CosmosDB). 

I started at Josh Long’s second favorite place on the Internet: start.spring.io. Here, you can bootstrap a new project with all sorts of interesting dependencies, including Azure! I defined my app’s group and artifact IDs, and then chose three dependencies: web, Azure Storage, and Azure Support. “Azure Storage” brings the jars in for storage, and “Azure Support” activates other Azure services when you reference their jars.

[Screenshot: selecting dependencies on start.spring.io]

I downloaded the resulting project and opened it in Spring Tool Suite. Then I added one new starter to my Maven POM file:

<dependency>
        <groupId>com.microsoft.azure</groupId>
        <artifactId>azure-documentdb-spring-boot-starter</artifactId>
</dependency>

That’s it. From these starters, Spring Boot pulls in everything our app needs. Next, it was time for code. This basic REST service serves up product recommendations. I wanted to store each request for recommendations as a log file in Azure Storage and as a record in DocumentDB. I first modeled a “recommendation” item that goes into DocumentDB. Notice the topmost annotation and its reference to a collection.

package seroter.demo.bootazurewebapp;
import com.microsoft.azure.spring.data.documentdb.core.mapping.Document;

@Document(collection="items")
public class RecommendationItem {

        private String recId;
        private String cartId;
        private String recommendedProduct;
        private String recommendationDate;
        public String getCartId() {
                return cartId;
        }
        public void setCartId(String cartId) {
                this.cartId = cartId;
        }
        public String getRecommendedProduct() {
                return recommendedProduct;
        }
        public void setRecommendedProduct(String recommendedProduct) {
                this.recommendedProduct = recommendedProduct;
        }
        public String getRecommendationDate() {
                return recommendationDate;
        }
        public void setRecommendationDate(String recommendationDate) {
                this.recommendationDate = recommendationDate;
        }
        public String getRecId() {
                return recId;
        }
        public void setRecId(String recId) {
                this.recId = recId;
        }
}

Next I defined an interface that extends DocumentDbRepository.

package seroter.demo.bootazurewebapp;
import org.springframework.stereotype.Repository;
import com.microsoft.azure.spring.data.documentdb.repository.DocumentDbRepository;
import seroter.demo.bootazurewebapp.RecommendationItem;

@Repository
public interface RecommendationRepo extends DocumentDbRepository<RecommendationItem, String> {
}

Finally, I built the REST handler that talks to Azure Storage and Azure DocumentDB. Note a few things. First, I have a pair of autowired variables. These reference beans created by Spring Boot and injected at runtime. In my case, they should be objects that are already authenticated with Azure and ready to go.

@RestController
@SpringBootApplication
public class BootAzureWebAppApplication {
        public static void main(String[] args) {
          SpringApplication.run(BootAzureWebAppApplication.class, args);
        }

        //for blob storage
        @Autowired
        private CloudStorageAccount account;

        //for Cosmos DB
        @Autowired
        private RecommendationRepo repo;

In the method that actually handles the HTTP POST, I first referenced an Azure Storage blob container and added a file there. This is where I got to use the autowired CloudStorageAccount. Next, I created a RecommendationItem object and saved it through the autowired DocumentDB repo. Finally, I returned a message to the caller.

@RequestMapping(value="/recommendations", method=RequestMethod.POST)
public String GetRecommendedProduct(@RequestParam("cartId") String cartId) throws URISyntaxException, StorageException, IOException {

        //create log file and upload to an Azure Storage Blob
        CloudBlobClient client = account.createCloudBlobClient();
        CloudBlobContainer container = client.getContainerReference("logs");
        container.createIfNotExists();

        String id = UUID.randomUUID().toString();
        String logId = String.format("log - %s.txt", id);
        CloudBlockBlob blob = container.getBlockBlobReference(logId);
        //create the log file and populate with cart id
        blob.uploadText(cartId);

        //add to DocumentDB collection (doesn't have to exist already)
        RecommendationItem r = new RecommendationItem();
        r.setRecId(id);
        r.setCartId(cartId);
        r.setRecommendedProduct("Y777-TF2001");
        r.setRecommendationDate(new Date().toString());
        repo.save(r);

        return "Happy Fun Ball (Y777-TF2001)";
}

Excellent. Next up, creating the actual Azure services! From the Azure Portal, I created a new Resource Group called “boot-demos.” This holds all the assets related to this effort. I then added an Azure Storage account to hold my blobs.

[Screenshot: creating the Azure Storage account]

Next, I grabbed the connection string to my storage account.

[Screenshot: the storage account's connection string]

I took that value, and added it to the application.properties file in my Spring Boot app.

azure.storage.connection-string=DefaultEndpointsProtocol=https;AccountName=bootapplogs;AccountKey=[KEY VALUE];EndpointSuffix=core.windows.net

Since I’m also using DocumentDB (part of CosmosDB), I needed an instance of that as well.

[Screenshot: creating the Cosmos DB (DocumentDB) account]

Can you guess what’s next? Yes, it’s credentials. Specifically, I needed the URI and primary key associated with my Cosmos DB account.

[Screenshot: the Cosmos DB URI and primary key]

I snagged those values and also put them into my application.properties file.

azure.documentdb.database=recommendations
azure.documentdb.key=[KEY VALUE]
azure.documentdb.uri=https://bootappdocs.documents.azure.com:443/

That’s it. Those credentials get used when activating the Azure beans, and my code gets access to pre-configured objects. After starting up the app, I sent in a POST request.
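The request looked something like this (host, port, and cart id are assumptions for a local test):

curl -X POST "http://localhost:8080/recommendations?cartId=cart-1234"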

[Screenshot: sending the POST request]

I got back a “recommended product”, but more importantly, I didn’t get an error! When I looked back at the Azure Portal, I saw two things. First, I saw a new log file in my newly created blob container.

[Screenshot: the new log file in the blob container]

Secondly, I saw a new database and document in my Cosmos DB account.

[Screenshot: the new database and document in Cosmos DB]

That was easy. Spring Boot apps, consuming Microsoft Azure services with no fuss.

Note that I let my app automatically create the Blob container and DocumentDB database. In real life you might want to create those ahead of time in order to set various properties and not rely on default values.

Bonus Demo – Running this app in Cloud Foundry

Let’s not stop there. While the above process was simple, it can be simpler. What if I don’t want to go to Azure to pre-provision resources? And what if I don’t want to manage credentials in my application itself? Fret not. That’s where the Service Broker comes in.

Microsoft created an Azure Service Broker for Cloud Foundry that takes care of provisioning resources and attaching those resources to apps. I added that Service Broker to my Pivotal Web Services (hosted Cloud Foundry) account.

[Screenshot: the Azure Service Broker in Pivotal Web Services]

When creating a service instance via the Broker, I needed to provide a few parameters in a JSON file. For the Azure Storage account, it’s just the (existing or new) resource group, account name, location, and type.

{
  "resourceGroup": "generated-boot-demo",
  "storageAccountName": "generatedbootapplogs",
  "location": "westus",
  "accountType": "Standard_LRS"
}

For DocumentDB, my JSON file called out the resource group, account name, database name, and location.

{
  "resourceGroup": "generated-boot-demo",
  "docDbAccountName": "generatedbootappdocs",
  "docDbName": "recommendations",
  "location": "westus"
}

Sweet. Now to create the services. It’s just a single command for each service.

cf create-service azure-documentdb standard bootdocdb -c broker-documentdb-config.json

cf create-service azure-storage standard bootstorage -c broker-storage-config.json
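As a quick check from the CLI, the standard Cloud Foundry command below also lists the new service instances and their provisioning status:

cf services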

To prove it worked, I snuck a peek at the Azure Portal, and saw my two new accounts.

[Screenshot: the two generated accounts in the Azure Portal]

Finally, I removed all the credentials from the application.properties file, packaged my app into a jar file, and added a Cloud Foundry manifest. This manifest tells Cloud Foundry where to find the deployable asset, and which service(s) to attach to. Note that I’m referencing the ones I just created.

---
applications:
- name: boot-azure-web-app
  memory: 1G
  instances: 1
  path: target/boot-azure-web-app-0.0.1-SNAPSHOT.jar
  services:
  - bootdocdb
  - bootstorage

With that, I ran a “cf push” and the app was deployed and started up by Cloud Foundry. I saw that it was successfully bound to each service, and the credentials for each Azure service were added to the environment variables. What’s awesome is that the Azure Spring Boot Starters know how to read these environment variables, so there are no more credentials in my application package. My environment variables for this app in Cloud Foundry are shown here.

[Screenshot: the app's environment variables in Cloud Foundry]

I called my service running in Cloud Foundry, and as before, I got a log file in Blob storage and a document in Document DB.

These Spring Boot Starters offer a great way to add Azure services to your apps. They work like any other Spring Boot Starter, and also have handy Cloud Foundry helpers to make deployment of those apps super easy. Keep an eye on Microsoft’s GitHub repo for these starters. More good stuff coming.
