How to reset the built-in administrator account password from an Azure BizTalk VM.

Are you careless like me and constantly forget your (non-critical) credentials? If so, this post about how to reset the built-in administrator account password on an Azure BizTalk VM will save you from wasting a few hours in vain.

I think that the life of a consultant is difficult when it comes to system credentials: you have your personal credentials (AD account, company and personal emails, and so on), and for each client you may also have different accounts. As if this were not already complicated enough, each system may have different criteria that require more or less complex passwords. For those cases, the solution is more or less simple: we annoy the sysadmin and ask him to reset our password. However, for the Azure VMs that we create in our own Azure subscription for demos or production, that can be a more complicated scenario. I tend to create and delete several BizTalk Server Developer Edition machines, especially for POCs, workshops, or courses, and sometimes (aka all the time) I forget the built-in administrator password. So, how can we reset the built-in administrator account password on an Azure BizTalk Server VM?

Note: I am referring to an Azure BizTalk Server VM, mainly because my blog is about Enterprise Integration, but this can apply to any type of Azure VM.

Most of the time the answer is very simple:

  • You access the Azure portal (https://portal.azure.com/) and select the virtual machine whose password you want to reset
  • Then click Support + Troubleshooting > Reset password. The password reset blade is displayed:

Reset the built-in administrator account password: Azure Portal BizTalk VM

  • You just need to enter the new password and then click Update. You will see a message in the upper-right corner saying the reset password task is in progress.

Reset the built-in administrator account password: Azure Portal BizTalk VM task begin

  • The result of the task will be presented in the Notification panel, and most of the time you will find a “Successfully reset password” message

Reset the built-in administrator account password: Azure Portal BizTalk VM task complete

But there is always “an exception to the rule”, and this was one of those cases. When I tried to reset the password through the Azure portal, I kept getting an “Unable to reset the password” message; to be honest, I don’t know exactly why. I then tried to reset the password by using PowerShell, as described in the documentation How to reset the Remote Desktop service or its login password in a Windows VM:

Set-AzureRmVMAccessExtension -ResourceGroupName "myResourceGroup" -VMName "myVM" -Name "myVMAccess" -Location "WestUS" -typeHandlerVersion "2.0" -ForceRerun
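
For completeness, the documented approach also supplies the new local administrator credentials when running this cmdlet. Here is a minimal sketch of that fuller form (the account name and password come from a Get-Credential prompt; parameter names are assumed from the AzureRM.Compute module of that time):

# Prompt for the local administrator account name and its new password (placeholder values)
$cred = Get-Credential

# Re-provision the VMAccess extension with the new credentials (parameter names assumed)
Set-AzureRmVMAccessExtension -ResourceGroupName "myResourceGroup" -VMName "myVM" `
    -Name "myVMAccess" -Location "WestUS" -typeHandlerVersion "2.0" `
    -UserName $cred.GetNetworkCredential().UserName -Password $cred.GetNetworkCredential().Password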

But I was still not able to perform this operation, and I kept getting this annoying error:

“…Multiple VMExtensions per handler not supported for OS type ‘Windows’. VMExtension ‘…’ with handler ‘Microsoft.Compute.VMAccessAgent’ already added or specified in input.”

Solution

To solve this problem, I was forced to remove the existing VMExtension by:

  • First, getting the list of extensions on the VM to find the name of the existing extension (highlighted in red in the picture below)
Get-AzureRmVM -ResourceGroupName [RES_GRP_NAME] -VMName [VM_NAME] -Status

Reset the built-in administrator account password: Azure Portal BizTalk VM PowerShell Get-AzureRmVM
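
If you prefer not to scan the full status output by eye, a short sketch like the following can list just the installed extensions (it assumes the Extensions property exposed by the -Status instance view):

# List only the name and type of each extension installed on the VM (property names assumed)
$vm = Get-AzureRmVM -ResourceGroupName [RES_GRP_NAME] -VMName [VM_NAME] -Status
$vm.Extensions | Select-Object Name, Type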

  • And then removing the VMAccess extension from the virtual machine
Remove-AzureRmVMAccessExtension -ResourceGroupName [RES_GRP_NAME] -VMName [VM_NAME] -Name [EXT_NAME]
  • You will get a confirmation question: “This cmdlet will remove the specified virtual machine extension. Do you want to continue?”. Type “Y” and then press ENTER to accept and continue.

Reset the built-in administrator account password: Azure Portal BizTalk VM PowerShell Remove-AzureRmVMAccessExtension
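
If you are scripting this cleanup end to end and want to skip the confirmation prompt, the removal can presumably be run unattended; the -Force switch below is an assumption based on the usual AzureRM cmdlet convention:

# -Force (assumed) suppresses the confirmation prompt so the removal can run unattended
Remove-AzureRmVMAccessExtension -ResourceGroupName [RES_GRP_NAME] -VMName [VM_NAME] -Name [EXT_NAME] -Force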

After that, you can access the Azure portal (https://portal.azure.com/), select the virtual machine whose password you want to reset, click Support + Troubleshooting > Reset password, and update the built-in administrator account password without hitting the previous problem.

Author: Sandro Pereira

Sandro Pereira lives in Portugal and works as a consultant at DevScope. In the past years, he has been working on implementing Integration scenarios both on-premises and cloud for various clients, each with different scenarios from a technical point of view, size, and criticality, using Microsoft Azure, Microsoft BizTalk Server and different technologies like AS2, EDI, RosettaNet, SAP, TIBCO etc. He is a regular blogger, international speaker, and technical reviewer of several BizTalk books all focused on Integration. He is also the author of the book “BizTalk Mapping Patterns & Best Practices”. He has been awarded MVP since 2011 for his contributions to the integration community.

Announcing: Preview – BizTalk Server Accelerator for SWIFT: Message Pack 2017

In order to provide industry-standard compliance with the SWIFT Standards MT Release 2017, Microsoft® is offering, to customers with Software Assurance, updates to the flat-file (MT) messaging schemas used with the Microsoft BizTalk Accelerator for SWIFT.

The A4SWIFT Message Pack 2017 contains the following:

  • Re-packaging of all SWIFT FIN message types and business rules
  • Updates to schemas and business rules for compliance with SWIFT 2017 requirements
  • Re-structuring of FIN response reconciliation (FRR) schemas

Please refer to the documentation available at the download link for more details.

For customers with Microsoft Software Assurance, this message pack can be used with BizTalk Server 2013, 2013 R2, and 2016. You can download the production-ready preview now from the Microsoft Download Center, at link.

BizTalk Server Team

Why you should attend INTEGRATE 2017 (USA)?

Are you an Integration expert? Want to get up to speed on the Microsoft Integration technologies? Want to hear what the Microsoft Product Group is up to, their vision and road map? Missed INTEGRATE 2017 London edition? Then INTEGRATE 2017 USA is the answer to all these questions.

BizTalk360 and ServiceBus360 are thrilled to partner with Microsoft and present INTEGRATE 2017 USA on October 25-27, 2017 at the Microsoft Redmond Campus, WA.

Here’s what Jim Harrer, Pro Integration Group PM at Microsoft, and Saravana Kumar, Founder/CTO of BizTalk360, have to say about the INTEGRATE 2017 USA event.

Why you should attend INTEGRATE 2017?

In today’s world, integration has become crucial to the success of any organization. Gone are the days when individual monolithic applications solved our big problems like CRM, ERP, etc. Today, applications are connected, not just on-premises but extending to cloud-based SaaS products like Salesforce, Workday, Dynamics 365, etc.

Ten years ago the world of integration was very small: it was just BizTalk Server, WCF, and a few endpoints like FTP, File, SMTP, etc. However, today the landscape is bigger and more complex.

In the past 5 years or so, Microsoft has invested significantly in various technology stacks, both on-premises and in the cloud, realizing the challenges that companies face in making these connected systems work together.

It’s important to understand how the various technologies join together to provide a consolidated platform. Today, if you are doing integration on the Microsoft stack, you need to be aware of at least the following technologies:

  • Microsoft BizTalk Server
  • Azure Logic Apps
  • Azure Service Bus Messaging
  • Azure Relays
  • Azure EventHubs
  • Azure Event Grid
  • Azure Stream Analytics
  • Azure API Management
  • Azure Functions
  • Azure Application Insights/Log Analytics, and
  • Third party products like BizTalk360 & ServiceBus360

So where can you get a deep dive into all these technologies directly from the people who built them?

“INTEGRATE 2017 (USA) is the only option”

There is no other way to get a deep dive into all these technologies within such a short period of time (3 intense days). If you are confused by the pace at which things are moving and need clarity on the overall roadmap and direction of the Microsoft Integration world, you need to be present at INTEGRATE 2017.

We are also pushing hard from our end to make people understand the importance of Integration in the Microsoft stack, and this event is very much organized in a community spirit. If you attend a 1-week instructor-led training or any technology conference (e.g., Ignite, Inspire) that spans 4 days, the typical cost will be around $2,500 to $5,000. However, we are organizing INTEGRATE 2017 for $599.

More than that, generic technology conferences like Ignite have sessions covering a wide range of technologies, and you’ll only find the occasional session on integration, whereas INTEGRATE 2017 is purely focused on one and only one thing: INTEGRATION.

Event Details

  • October 25-27, 2017
  • Building 92, Microsoft Campus, Redmond WA
  • 25 Sessions
  • 30 Speakers (Microsoft Product Group & Microsoft MVPs)

This is our second global event this year, repeating the success we had with INTEGRATE 2017 (London). Close to 400 attendees from 50+ countries across Europe attended that event. You can get a glimpse of it by watching the videos here: INTEGRATE 2017 (London) Videos & Presentations.

Are you still not convinced? 🙂 Don’t miss out: register today and take advantage of the early bird offer.

Keynote & Sessions

We are delighted to announce that Scott Guthrie, Executive Vice President at Microsoft, will be delivering the keynote speech on October 25. Scott’s presence at the event signifies the importance of Microsoft Integration technologies, both in Azure and on-premises. Scott will deliver the keynote addressing the Microsoft vision and roadmap for Integration.

INTEGRATE 2017 USA Keynote

You can find the speaker list and the agenda on the event website https://www.biztalk360.com/integrate-2017-usa/.

Pricing

Registration for the INTEGRATE 2017 USA event is already open. Early bird registration closes on August 31st (which is just about 15 days away!). The pricing model for the event is pretty simple, as shown below. We also have a special offer: a discount of $100 on each ticket if you book 2 or more tickets. So what are you waiting for? Register for the event here.

INTEGRATE 2017 USA Pricing

Sponsorship

We are also opening up sponsorship opportunities for this event. There are sponsorship packages available at different levels. If you are interested in sponsoring this event, please contact us at contact@biztalk360.com with the subject line “INTEGRATE 2017 USA – SPONSOR DETAIL”.

We hope to see you at INTEGRATE 2017 (USA)!

Author: Saravana Kumar

Saravana Kumar is the Founder and CTO of BizTalk360, an enterprise software that acts as an all-in-one solution for better administration, operation, support and monitoring of Microsoft BizTalk Server environments.

Consume Adapter Service option is missing from Add Generated Items in Visual Studio

The Consume Adapter Service option from “Add Generated Items…” inside Visual Studio is a metadata generation tool (or add-in), included in the WCF LOB Adapter SDK, that can be used with BizTalk projects to allow BizTalk developers to search or browse metadata from LOB adapters and then generate XML schemas for the selected operations inside Visual Studio.

This is a simple sample of the Consume Adapter Service tool to generate XML Schemas for SQL Server operations:

Consume Adapter Service tool SQL Server sample

However, while recently working in a client development environment that had the LOB adapters installed and configured, I noticed that the Consume Adapter Service option was missing from the “Add Generated Items…” window in Visual Studio.

Consume Adapter Service option missing from Visual Studio

Cause

In our case, we did indeed have the LOB adapters installed and configured in the environment; however, we only had the runtime of the WCF LOB Adapter SDK installed. In other words, we didn’t have the WCF LOB Adapter SDK fully installed.

The Consume Adapter Service tool will only be available in Visual Studio if you install the Tools option of the WCF LOB Adapter SDK. This option includes the Adapter Code Generation Wizard and the Visual Studio Add-in Components.

Consume Adapter Service: WCF LOB Adapter SDK Tools option

Note: Personally, I recommend that you perform a full installation (all components) of the WCF LOB Adapter SDK on BizTalk Server Development environments.

Solution

The solution is easy for this particular case: you just need to install the WCF LOB Adapter SDK Tools.

To install the WCF LOB Adapter SDK Tools, you need to:

  • Close any programs you have open. Run the BizTalk Server <version> installer as Administrator.
  • On the Start page, click “Install Microsoft BizTalk Adapters”
  • In the BizTalk Adapter Pack Start page, select the first step “Step 1. Install Microsoft WCF LOB Adapter SDK”. An installer of SDK is launched.
    • On the “Welcome to the Windows Communication Foundation LOB adapter SDK Setup Wizard” page, click “Next”
    • On the “Change, repair, or remove installation” page, select the “Change” option

 Consume Adapter Service: Change WCF LOB Adapter SDK installation

    • On the “Custom Setup” page, make sure that you select the option “Tools” to be installed and click “Next”

Consume Adapter Service: WCF LOB Adapter SDK Tools option

Note: Again, I personally recommend that you perform a full installation (all components) of the WCF LOB Adapter SDK on BizTalk Server Development environments.

    • On the “Ready to change Windows Communication Foundation LOB Adapter SDK Setup” page, click “Change” to begin the installation

Consume Adapter Service: confirm change WCF LOB Adapter SDK installation

After the installation is finished, if you open your BizTalk project solution once again in Visual Studio, you will see that the Consume Adapter Service option is now available in the “Add Generated Items” window:

  • In Visual Studio, in the Project pane, right-click your BizTalk Server project, and then choose Add | Add Generated Items… | Consume Adapter Service.

Consume Adapter Service present in Visual Studio

This problem can happen, and the solution is the same, for all BizTalk Server versions (that include the LOB adapters).

Author: Sandro Pereira

Sandro Pereira lives in Portugal and works as a consultant at DevScope. In the past years, he has been working on implementing Integration scenarios both on-premises and cloud for various clients, each with different scenarios from a technical point of view, size, and criticality, using Microsoft Azure, Microsoft BizTalk Server and different technologies like AS2, EDI, RosettaNet, SAP, TIBCO etc. He is a regular blogger, international speaker, and technical reviewer of several BizTalk books all focused on Integration. He is also the author of the book “BizTalk Mapping Patterns & Best Practices”. He has been awarded MVP since 2011 for his contributions to the integration community.

Building reactive, event driven solutions with the new Azure Event Grid Service

Microsoft has released yet another service in its Azure platform, named Event Grid. It enables you to build reactive, event-driven applications around its event routing capabilities. You can receive events from multiple sources or have events pushed (fanned out) to multiple destinations, as the picture below shows.

New possible solutions with Event Grid

With this new service, some nifty serverless solution architectures become possible, where this service has its role and value. For instance, you can run image analysis when, let’s say, a picture of someone is added to blob storage. The event (a new picture in blob storage) can be pushed to Event Grid, where a Function or Logic App can handle it by picking up the image from blob storage and sending it to the Cognitive Services Face API. See the diagram below.

Another solution could involve creating an Event Grid topic to which you can push a workload, which an Azure Function, a Logic App, or both can then process. See the diagram below.

And finally, Event Grid lets professionals working on the operations side of Azure make their work more efficient when automating deployments of Azure services. For instance, a notification is sent once one of the Azure services is ready. Let’s say that once a Cosmos DB instance is ready, a notification needs to be sent.

The last sample solution is something we will build using Event Grid, based on the only walkthrough provided in the documentation.

Send a notification when Cosmos DB is provisioned

To have a notification sent to you by email once an Azure service is created, a Logic App is triggered by an event (raised once the service is created in a certain resource group). The Logic App triggered by the event acts upon it by sending an email. The trigger and action are the entire logic, and it’s easy to implement. The Logic App subscribes to the event raised within the resource group when a new Azure service is ready.

Building a Logic App is straightforward, and once it is provisioned you can choose a blank template. Subsequently, you add a trigger; for our solution it’s the Event Grid trigger for when a resource event occurs (the only trigger available currently).

The second step is adding a condition to check the event in the body. In this condition, in advanced mode, I created: @equals(triggerBody()?['data']['operationName'], 'Microsoft.DocumentDB/databaseAccounts')

This expression checks the event body for a data object whose operationName property is the Microsoft.DocumentDB/databaseAccounts operation. See also Event Grid event schema.
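
To make the shape of the event concrete, here is a small PowerShell sketch that applies the same check as the Logic App condition to a sample payload (the payload fields are illustrative, loosely following the Event Grid event schema):

# Illustrative Event Grid event payload (fields simplified from the event schema)
$eventJson = '{ "eventType": "Microsoft.Resources.ResourceWriteSuccess", "data": { "operationName": "Microsoft.DocumentDB/databaseAccounts" } }'
$event = $eventJson | ConvertFrom-Json

# Same test as @equals(triggerBody()?['data']['operationName'], 'Microsoft.DocumentDB/databaseAccounts')
if ($event.data.operationName -eq 'Microsoft.DocumentDB/databaseAccounts') {
    Write-Output "Cosmos DB account was provisioned - send the notification email"
}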

The final step is to add an action in the true branch: an action that sends an email to an address with a subject and body.

To test this, create a Cosmos DB instance, wait until it is provisioned, and wait for the email notification.

Note: Azure Resource Manager, Event Hubs Capture, and the Storage blob service are the launch publishers. Hence, this sample is just an illustration and will not actually work!

Call to action

Getting acquainted with this new service was a good experience. My feeling is that this service will be a game changer with regard to building serverless, event-driven solutions. This service, in conjunction with services like Logic Apps, Azure Functions, Storage, and others, brings a whole new set of capabilities not matched by any other cloud vendor. I am looking forward to the evolution of this service, which is currently in preview.

If you work in the integration/IoT space, then this is definitely a service you need to be aware of and research. Good starting points are Introducing Azure Event Grid – an event service for modern applications and this InfoQ article.

Author: Steef-Jan Wiggers

Steef-Jan Wiggers is all in on Microsoft Azure, Integration, and Data Science. He has over 15 years’ experience in a wide variety of scenarios such as custom .NET solution development, overseeing large enterprise integrations, building web services, managing projects, designing web services, experimenting with data, SQL Server database administration, and consulting. Steef-Jan loves challenges in the Microsoft playing field, combining them with his domain knowledge in energy, utility, banking, insurance, health care, agriculture, (local) government, bio-sciences, retail, travel, and logistics. He is very active in the community as a blogger, TechNet Wiki author, book author, and global public speaker. For these efforts, Microsoft has recognized him as a Microsoft MVP for the past 7 years.

Azure Logic Apps OMS Monitoring – PREVIEW

The Azure Logic Apps team announced the preview of Azure Logic Apps OMS Monitoring. Microsoft terms this release the “New Azure Logic Apps solution for Log Analytics”. The basic idea behind this brand-new experience is to monitor and get insights into Logic App runs with Operations Management Suite (OMS) and Log Analytics.

The new solution is very similar to the existing B2B OMS portal solution. Azure Logic Apps customers can continue to monitor their Logic Apps easily via the OMS portal, the Azure portal, or even on the move with the OMS mobile app.

What’s new in the preview of Azure Logic Apps OMS Monitoring Portal?

  • View all the Logic Apps run information
  • Status of Logic Apps (Success or Failure)
  • Details of failed runs
  • With Log Analytics in place, users can also set up alerts to get notified if something is not working as expected
  • Easily/quickly turn on Azure diagnostics in order to push the telemetry data from the Logic App to the workspace

Enable OMS Monitoring for Azure Logic Apps

Follow the steps listed below to enable OMS Monitoring for Logic Apps (a rough PowerShell sketch of the same setup follows the list):

  1. Log in to your Azure Portal
  2. Search for “Log Analytics” (found under the list of services in the Marketplace), and then select Log Analytics.
  3. Click Create Log Analytics
  4. In the OMS Workspace pane,
    1. OMS Workspace – Enter the OMS Workspace name
    2. Subscription – Select the Subscription from the drop down
    3. Resource Group – Pick your existing resource group or create a new resource group
    4. Location – Choose the data center location where you want to deploy the Log Analytics feature
    5. Pricing Tier – The cost of workspace depends on the pricing tier and the solutions that you use. Pick the right pricing tier from the drop down.
      Azure Logic Apps OMS Monitoring
    6. Once you have created the OMS Workspace, create the Logic App. While creating the Logic App, enable Log Analytics by pointing it to the OMS workspace. For existing Logic Apps, you can enable OMS Monitoring from Monitoring > Diagnostics > Diagnostic settings.
      Azure Logic Apps OMS Monitoring
    7. Once you have created the Logic App, execute the Logic App with some run information
    8. Navigate back to the OMS Workspace that you created earlier. You will notice a message at the top of your screen asking you to upgrade the OMS workspace. Go ahead and do the upgrade process.
      Azure Logic Apps OMS Monitoring
    9. Click Upgrade Now to start the Upgrade process
      Azure Logic Apps OMS Monitoring
    10. Once the upgrade is complete, you will see the confirmation message in the notifications area.
      Azure Logic Apps OMS Monitoring
    11. Under Management section, click OMS Portal
      Azure Logic Apps OMS Monitoring
    12. Click Solutions Gallery on the left menu
      Azure Logic Apps OMS Monitoring
    13. In the solutions list, select Logic Apps Management solution
      Azure Logic Apps OMS Monitoring
    14. Click Add to add the Logic Apps monitoring view to your OMS workspace. Note that this functionality is still in preview at the time of writing this blog.
      Azure Logic Apps OMS Monitoring
    15. You will see the status of your Logic App (No. of Runs, count of succeeded, running, and failed runs)
      Azure Logic Apps OMS Monitoring
      NOTE: The Logic Apps run data did not appear immediately for me. I could see this data only in my third attempt (after selecting the region as West Central US, thanks to the tip from Steef-Jan Wiggers). Steef has also written a blog post about the Logic Apps and OMS integration capabilities. Therefore, please be ready to wait for some time to see the Logic App status on the OMS dashboard.
    16. Click the Dashboard area to view the detailed information
      Azure Logic Apps OMS Monitoring
    17. You can drill down the report by clicking on a particular status and viewing the detailed information
      Azure Logic Apps OMS Monitoring
    18. Click the record row to examine the run information in detail
      Azure Logic Apps OMS Monitoring
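
If you prefer scripting over the portal, steps 1 to 6 can roughly be reproduced in PowerShell. This is only a sketch: the resource names are placeholders, and the cmdlets and property names are assumed from the AzureRM.OperationalInsights, AzureRM.LogicApp, and AzureRM.Insights modules of that time.

# Create the OMS (Log Analytics) workspace (names, location and SKU are placeholders)
New-AzureRmOperationalInsightsWorkspace -ResourceGroupName "myResourceGroup" `
    -Name "myOmsWorkspace" -Location "West Central US" -Sku "PerNode"

# Point the Logic App's diagnostics at that workspace (property names assumed)
$workspace = Get-AzureRmOperationalInsightsWorkspace -ResourceGroupName "myResourceGroup" -Name "myOmsWorkspace"
$logicApp  = Get-AzureRmLogicApp -ResourceGroupName "myResourceGroup" -Name "myLogicApp"
Set-AzureRmDiagnosticSetting -ResourceId $logicApp.Id -WorkspaceId $workspace.ResourceId -Enabled $true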

You can now configure monitoring and diagnostics for Logic Apps directly against the OMS portal, which is very similar to the B2B messaging capabilities that existed earlier. I hope you found this blog useful in setting up Azure Logic Apps OMS Monitoring. I’m already excited about the next preview features to be rolled out by the Azure Logic Apps team.

Author: Sriram Hariharan

Sriram Hariharan is the Senior Technical and Content Writer at BizTalk360. He has over 9 years of experience working as documentation specialist for different products and domains. Writing is his passion and he believes in the following quote – “As wings are for an aircraft, a technical document is for a product — be it a product document, user guide, or release notes”.

Have you Backed up your BizTalk360 Database?

Think about all the information that is stored in your BizTalk360 database – Alarms, Knowledge Base, various Settings. This data is very important for many reasons. Now imagine if all of the information just disappeared.

Although it is a scary thought, it is highly unlikely that your company is not backing up your database. Let’s talk about how you can set up a Database Maintenance Plan.

Continuing with the Support Series, which discusses commonly occurring issues among our customers and how others can benefit from the knowledge gained when we helped resolve them, I will now focus on one key issue: maintenance and backup of your BizTalk360 database.

The Issue

A customer just found out that the SQL transaction log file is taking an enormous amount of space.

Back up your BizTalk Database

If you check SQL Server, the transaction log file of the BizTalk360 database is very large (over 19 GB).

Back up your BizTalk Database

The Solution

The quick Support reply in this scenario would be to refer to this article and resolve the issue by releasing unused space and forcing a backup:

https://assist.biztalk360.com/support/solutions/articles/1000142821-huge-sql-log-file-size
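
For context, that one-off fix presumably boils down to forcing a transaction log backup and then releasing the unused space. A minimal sketch (the logical log file name, server name, and paths are placeholders, and it assumes the SqlServer module's Invoke-Sqlcmd is available):

# One-off fix: back up the transaction log, then shrink it (names and paths are placeholders)
Invoke-Sqlcmd -ServerInstance "MySqlServer" -Database "BizTalk360" -Query @"
BACKUP LOG [BizTalk360] TO DISK = N'D:\Backups\BizTalk360_Log.trn';
DBCC SHRINKFILE (N'BizTalk360_log', 1024);  -- shrink the log file to roughly 1 GB
"@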

Identifying the Issue

The transaction log grows to be inordinately large on a database that’s in FULL or BULK_LOGGED recovery mode. This happens after a database backup has been taken, which switches the log into a mode where it won’t truncate until it’s been backed up.

In these circumstances, if you don’t take a transaction log backup, the log will continue to grow.

Our purging policies in BizTalk360 are as follows (defaults):

backup biztalk database

BizTalk360 purging will not stop the transaction log files from growing. In order to maintain a healthy BizTalk360 database, please ensure you have maintenance plans configured:

One to back up the BizTalk360 database and log every week, and a second one to delete backup files that are older than 2 weeks.
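
For illustration, the weekly full and log backups could also be scripted instead of (or in addition to) the maintenance plan. A minimal sketch, assuming the SqlServer/SQLPS PowerShell module and placeholder names and paths:

# Placeholders: adjust the instance, database name and backup folder to your environment
$instance = "MySqlServer"
$database = "BizTalk360"

# Weekly full backup of the BizTalk360 database
Backup-SqlDatabase -ServerInstance $instance -Database $database `
    -BackupFile "D:\Backups\BizTalk360_Full.bak" -BackupAction Database

# Transaction log backup, which allows the log to truncate and stop growing
Backup-SqlDatabase -ServerInstance $instance -Database $database `
    -BackupFile "D:\Backups\BizTalk360_Log.trn" -BackupAction Log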

You may also face this problem when you have configured quite a large number of alarms and these alarms are monitoring a large number of artifacts.

Setting up BizTalk360 Database Maintenance Plan – Backup

I am using the Microsoft SQL Server Management Studio.

  1. Select the Maintenance Plan Wizard.

backup biztalk database
  2. Select Single Schedule and then Click the Change Button.

backup biztalk database

  3. You can probably select a backup of your BizTalk360 database every week, on a day when you are not expecting too many transactions.

backup biztalk database

  4. Select the tick boxes for the backup tasks.

backup biztalk database

  5. Select the BizTalk360 database on each screen for the Full and Transaction Log backups.

backup biztalk database

Setting up the BizTalk360 Database Maintenance Plan – Clean-up Task

Now we need to set up a Clean-up Task. We suggest making this run every 2 weeks; this time, select the Clean-up Task instead of the backup tasks in the Maintenance Plan Wizard.

backup biztalk database

Lastly just provide the details for deleting the files (File Age) as shown below.

backup biztalk database

NOTE: Make sure your SQL Server Agent is running on the server.

You can adjust the schedule depending on the load of your environment.

So I hope this blog has given you some helpful information to ensure that your transaction log size doesn’t keep increasing, as long as the backup and clean-up tasks are managed and run properly alongside BizTalk360.

If you have any questions, contact us at support@biztalk360.com. Also, feel free to leave your feedback in our forum.

Author: Rochelle Saldanha

Rochelle Saldanha is currently working in the Customer Support & Client Relationship Teams at BizTalk360. She loves travelling and watching movies.

Microsoft Integration Weekly Update: Aug 14

Do you find it difficult to keep up to date with all the frequent updates and announcements in the Microsoft Integration platform?

The Integration weekly update can be your solution. It’s a weekly update on topics related to Integration – enterprise integration, robust & scalable messaging capabilities, and citizen integration capabilities empowered by the Microsoft platform to deliver value to the business.

If you want to receive these updates weekly, then don’t forget to Subscribe!

On-Premise Integration:

Cloud and Hybrid Integration:

Feedback

Hope this is helpful. Please feel free to let me know your feedback on the Integration weekly series.

Most awaited Azure Table Storage Connector is available now!

Recently, the Azure Table Storage connector was released in preview.

For now, the connector is only available in West Central US. Hopefully, it will soon be rolled out to other data centers.

To play around with this connector, I created a very simple Logic App which pulls an entity from table storage.

I have a storage table called RobustCloudIntegrationWithAzure, as shown below.

This table basically stores all the author and chapter names of the book Robust Cloud Integration with Azure.

The author or the chapter is the partition key, and the sequence number is the row key. To get any author or chapter details, you need to pass the partition key and the row key to the Logic App.

First, you need to make a connection to the Azure storage table by providing the storage account name and the shared storage key. You also need to give your connection a name.

Once you have made the connection successfully, you can use any of the CRUD operation actions. In this case, I am using Get Entities, which is basically a select operation.

Once you have selected the table, you have the option to use the Filter and Select OData queries. In the Filter Query, I have a condition to check the partitionKey, which comes from the input request. In the Select Query, you can choose the columns of the table to display.
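
For reference, the Filter Query follows the standard Table storage OData filter syntax; a tiny sketch of how the filter string used above could be built (the values are illustrative, and in the Logic App the partition key comes from the incoming request):

# Illustrative values; the Logic App builds the same filter from the incoming request
$partitionKey = "Author"
$filterQuery  = "PartitionKey eq '$partitionKey'"
# e.g. PartitionKey eq 'Author'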

So, this Logic App receives a request with the partitionKey and rowKey as inputs.

Then it checks the value of the partitionKey. If the partitionKey is equal to the author, the Author action is executed; otherwise, the Chapter action is executed. Depending on the partitionKey, either the author or the chapter details are sent out as the response from the Logic App workflow.

Here are the sample request and response using Postman.

Author

Chapter

Conclusion

The Azure Table Storage connector was one of the most-voted requests to the Logic Apps team, and now it’s available to use.

Our experience at Design Thinking Summit 2017

Design Thinking – the name sounds different. Can we design our thinking? Yes, we can, and this is what we learnt at the Design Thinking Summit 2017, which was held at IIM Bangalore. We are grateful to our organization for providing us with such a wonderful opportunity to participate in this event.

BizTalk360 always focuses on the motto “You grow, we grow, together we grow”. In this way, they always help employees acquire skills through different learning and training programs. One such opportunity was given to 6 of us to attend the Design Thinking Summit 2017, and I am lucky to be one among them. In this blog, I would like to share my experiences at the DTSummit. Special thanks to Saranya and Umamaheshwaran for adding more meaning to this blog by sharing their experiences.

Design Thinking Summit 2017

An intro to Design Thinking Summit – Insight:

Design Thinking is a creative, repeatable, human-centered approach to problem-solving and innovation. It draws upon experience, imagination, intuition, and reasoning to explore possibilities of what could be, and to create desired outcomes that benefit your customers. This summit was organized by a group called Pensaar, powered by a team of highly experienced design thinkers and problem solvers. Over the 3-day workshop conducted by the Pensaar team, we learnt how to understand customers, articulate insights that inspire innovation, and ideate until we get disruptive ideas that we can rapidly test with customers. It is focused on learning by doing, all while experimenting, experiencing, having fun, and being surprised. There were around 160 participants at the DTSummit this year.

Day 1 at the Design Thinking Summit:

The event was all new for our BizTalk360 team. We were asked to assemble at the event venue at 8.30 AM. To our surprise, the participants were split into different teams, and each one of us ended up in a different team. This was a nice experience, as we got to know different people; the participants came from different professions. We were given cards with our photo attached and the table number written on them. Everything was a team activity, with a team coordinator for each team.

Design thinking involves four stages, namely:

  • Discover – understand people and their ideas
  • Insight – Identify trends and inspire innovation
  • Dream – Ideate solutions for problem statements defined
  • Disrupt – Prototyping techniques that visualize solutions

Design Thinking Summit 2017

The first day was about “Insight”. The first step towards insight is “Discover”: the foremost task is to understand people and their ideas. “Insight” stands for identifying trends and patterns in data that will inspire innovation. The quotes below explain it:

Fall in love with the problem and not with the solution

Products must be created for behaviors and not for intentions

The first day started with an activity to come up with an innovative name for each team. Stationery was provided along with post-its. An interesting thing to note at the venue (IIM Bangalore) was that plastics are banned, and we were given glass water bottles with our names printed on them. There were around 12 teams, and each team came up with a unique name.

The interesting interview:

The next event was an interesting interview with a reputed industrialist. The aim was to capture the person’s insights and utilize them for a better understanding of the requirements. We were asked to listen to the interview and note down our points on post-its. The important feature of a post-it is that we cannot write long stories on it; the notes must be short and understandable. Hence, we need to make sure we choose the right words to describe our points and ideas. Some of the key insights derived from the interview were:

  • Be focused on the process
  • Build expertise and use it when the opportunity is given
  • Focus more on soft skills

For example, consider a scenario where we gather the requirements for a product from a customer. The skills to be exercised in this process are:

  • Asking open ended questions
  • Listening skills
  • Observing skills

The Research Methodology:

Once the customer requirements are gathered, the next step is to dig deep into them for better understanding. One of the research methodologies was:

Ecosystem Map:

This is a visual representation of the landscape within which a problem exists. The map contains the connections between the different stakeholders involved in the problem. We can visually depict the interconnections and interdependencies between the stakeholders in the system. This way we can draw key inferences and insights by asking questions like: what are the challenges in the system, what can be improved, and what interventions can be made to create a positive impact?

Arriving at the problem statement:

We now have the ecosystem map. The next activity is to identify the problem statement. We can consider any one of the stakeholders and derive the statement for them. The stakeholder may be a customer, an employee, the government, or the senior management of the organization. Each team member was asked to write down his/her problem statement based upon the following points, describing:

  • User characteristics
  • Outcome the user tries to achieve
  • Barrier existing to achieve the outcome
  • Root cause of the barrier
  • How the user feels because of the root cause and the barrier

This problem statement is important because it is from this point that we move forward in deriving a solution. From the individual points, the team coordinator would lead a discussion and come up with a single problem statement for the team. The problem statement is written from the user’s point of view, and it helps to identify and articulate the right problem to solve for the users.

There are different tools which help us in deriving the problem statement, such as:

Empathy map – mapping the different data points for the user

Subway map – plotting the objectives with respect to the current state and prioritizing them.

Design Thinking Summit 2017

User persona – Using quote cards, we can derive the insights for different problems given in the cards.

Journey line – steps involved from arriving at the problem statement to improving on the solutions

These tools are considered convergent research techniques for understanding the problem better. At the end of the first day, the Pensaar team collected feedback about the activities conducted.

Day 2 at the Design Thinking Summit:

The first day was interesting, and its outcome was the problem statement. Then came the second day of the event. We were all even more excited about the second-day activities. The second day started with the introduction of the Pensaar team, who were behind the scenes of this wonderful summit.

The agenda for this day was “Dream”. The first day resulted in the problem statement, built from the insights obtained from the different groups. Now we needed to work our way towards a solution for this problem. But that would not be so easy: one problem statement would be worked upon by all 12 teams, so there would be different solutions, and it’s important to identify the best one.

Arriving at the Customer Benefits:

The first activity of the second day was to “identify three key customer benefits”. A customer benefit leads to an improvement in the customer’s life; it is what matters most to the customer when choosing our product over others. The benefits can be measured through certain metrics, which help you identify the right priorities to acquire more customers. It can be done by crafting a creative question starting with “How might we…”. This lets you reframe the problem as an opportunity, ideate solutions with a sense of optimism, and see the possibilities.

Lunch break:

There was another surprise waiting for us during the lunch break: a picnic lunch for each team. The team members had to collect the lunch for their teammates and have it under the trees in a different area. This was very interesting, and we all enjoyed it.

Tools for Ideating:

The next step is to ideate solutions for the problem statement based on the key customer benefits, and this was the next activity given. There are various tools used for ideation, and a few of them were given to the teams for the activity. A few among them are:

Question storming:

This is a method for discovering the questions that make breakthrough differences in problem-solving, innovation, operational excellence, and culture. The questions must be focused on the facts and the situation to get to the root of a problem.

Emerging Tech cards:

These are small cards containing information about the emerging technologies in different areas. The activity was to identify the relevant tech card and find out how to make use of it in identifying the solution to the problem.

Design Thinking Summit 2017

Biomimicry:

This is drawing inspiration from nature to design the solution. Simply put, it is mimicking nature to inspire sustainable and innovative solutions. We can take the example of ants and their ability to self-organize to find the shortest route; this can be used to find the best solution.

World Café:

This was a post-lunch activity. The teams were asked to write down their problem statement and their ideas for different solutions. It is meant to build collaboration among the teams rather than working individually, so each team member visited other teams to gather knowledge about their ideas and provide some input for improvements.

With this activity, we came to an end for the second day.

Day 3 at the Design Thinking Summit:

Day 3 was filled with even more enthusiasm among the teams because we all had new friends, and the past two days had given us a different experience. This day started with the activities for “Disrupt”, which is about developing prototypes for the derived solutions and then experimenting with them. It started with:

Story Board:

It’s a visual tool to build a narrative around the solution in order to get feedback and refine the concept. The teams were asked to build the storyboard with their problem statement and solutions.

Design Thinking Summit 2017

Message Map:

This is an excellent tool to create an elevator pitch to communicate our concept to users in less than 15 seconds. The steps include creating a Twitter-friendly message about the solution and adding supporting points to explain it.

Design Thinking Summit 2017

Experimenting the solution:

The final activity of the event was experimenting with the solutions. Each team was asked to create an experiment card, which includes the hypothesis, the experiment, the metric, and the outcome. With this card, we can test our solutions with different users and record the outcome. The teams moved around IIM to find users and filled in those cards according to the responses received. It was a totally different experience, where we also travelled outside to find users and get feedback from them.

Conclusion:

It was a totally fantastic experience for all of us. Design thinking starts with identifying the exact problem statement (Insight), ideating through different solutions (Dream), and experimenting with those ideas (Disrupt), for the development of an employee as well as an organization. These tools can also be utilized in our day-to-day activities for the betterment of our lives as well as our careers. Thanks to BizTalk360 for giving us the chance to participate in this event, and we look forward to more such events.

Author: Praveena Jayanarayanan

I am working as a Senior Support Engineer at BizTalk360. I always believe in teamwork leading to success because “We all cannot do everything or solve every issue. ‘It’s impossible’. However, if we each simply do our part, make our own contribution, regardless of how small we may think it is…. together it adds up and great things get accomplished.”