Processing Feedback Evaluations (paper) automagically with SmartDocumentor OCR, Microsoft Flow & Power BI

Some time ago I wrote a blog post on how to Process Feedback Evaluations (paper) automagically with SmartDocumentor OCR, Logic Apps, Azure Functions and Power BI. At that point, many of the functionalities that we have today in both Logic Apps and Microsoft Flow didn't exist yet: the concept of variables, Case operations, more expressions and so on. I will not address in this blog post whether we could redesign that solution to be different or better. The question I raise and will address here is: can we do the same with Microsoft Flow instead of using Logic Apps?

And the reason I ask this question is that Microsoft Flow is targeted more at Business Users, which fits perfectly with scenarios in which we want to extend a product to fit each business user's requirements/scenarios. If you have an Office 365 subscription, you can use Microsoft Flow and you will have 2,000 runs per month, so you don't need to pay extra to use Logic Apps – you can use Microsoft Flow instead.

And the answer is: Yes, you can!

The problem and scenario will be exactly the same: How can we easily convert paper into data to generate additional value? How can we perform operations on it and easily gain insights?

Processing Feedback Evaluations Paper: The Problem

But in this case, to solve this problem – in which I wanted the evaluation forms to be processed in real-time, i.e., as the attendees handed in the forms, the results were presented in a public Power BI dashboard in a matter of seconds – we will be using:

  • DevScope SmartDocumentor OCR, which not only allowed me to extract the data from my documents and easily integrate it with other systems, but also to intelligently set up my OCR streams (flows), defining:
    • Different receive locations, like FTP, file or directly from scanner devices;
    • Create/test my recognition templates and review/validate the data that is gathered;
    • But it also enabled me to connect and send the metadata, as XML or JSON, through any kind of HTTP service; I could even extend it by using a PowerShell provider that would enable me to execute a PowerShell script.
  • Microsoft Flow, which allows all types of Business Users to create and automate workflows across multiple applications and services, without the need for developer help, in a very simple and fast way. These automated workflows are called flows.
  • And finally, Power BI to create interactive data visualizations (dashboards and reports).

Processing Feedback Evaluations Paper: The Solution

Processing Feedback Evaluations: The solution

SmartDocumentor: to process and extract metadata from paper

Again, I'm not going to explain in detail how the solution is implemented inside DevScope's SmartDocumentor, for that is not the point of this article; if you want to know more about it, you can always contact me. However, let me give you some context:

  • The SmartDocumentor OCR flow will be listening on two receive locations: a shared folder and directly from the scanner device;
  • After receiving and extracting the data from the scanned documents (paper), SmartDocumentor will send the metadata to a Microsoft Flow HTTP endpoint.

SmartDocumentor OCR Review Station: Survey Template

  • Inside the SmartDocumentor processes, we can optionally specify whether we want to review the documents – in the SmartDocumentor Review Station – before sending them to Microsoft Flow (or any other system).

Processing Feedback Evaluations Paper: SmartDocumentor Flow Solution

Power BI to deliver interactive data visualization (dashboards and reports)

Regarding Power BI, the Microsoft Flow Power BI connector (which is the same as the Logic Apps connector) only allows you to use streaming datasets (this has advantages and some disadvantages that we will see further on), which let you easily build real-time dashboards by pushing data into a REST API endpoint. To create your streaming dataset, you should sign in to Power BI with your account:

  • Select your 'Workspace → Datasets', and then on the top right corner click '+ Create' and then 'Streaming dataset'

Processing Feedback Evaluations Paper: Create Power BI Streaming Dataset

  • In the ‘New streaming dataset’, select ‘API’ and then click ‘Next’
  • In the second ‘New streaming dataset’, give a name to your dataset: “FeedbackForm” and then add the following elements:
    • SpeakerName (Text) – represents the name of the speaker that is obtained in the evaluation form according to the session.
    • ContentMatureFB (Number) – a value between 1 and 9 that is obtained in the evaluation form
    • GoodCommunicatorFB (Number) – a value between 1 and 9 that is obtained in the evaluation form
    • EnjoySessionFB (Number) – a value between 1 and 9 that is obtained in the evaluation form
    • MetExpectationsFB (Number) – a value between 1 and 9 that is obtained in the evaluation form
    • SpeakerAvrg (Number) – a simple average calculation (the sum of the four metrics above divided by 4)
    • WhoAmI (Text) – represents the type of attendee you are (developer, architect, …) that is obtained in the evaluation form
    • SpeakerPicture (Text) – picture of the speaker, according to the session, that is obtained from the evaluation form.

Processing Feedback Evaluations Paper: Power BI Streaming Dataset

  • And because we want to create interactive reports in order to gain more insights from the event, we need to enable 'Historic data analysis' and then click 'Create'

Unfortunately, a streaming dataset is meant to be used for real-time streaming and is a little limited in terms of what we can do with it. For example, it doesn't allow you to combine different sources – such as a "table" that correlates each speaker with their picture – or to make aggregations of metrics like "Speaker average". This means that we need to send all of this information from Microsoft Flow.
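As a side note, and only as an illustrative sketch (the workspace ID, dataset ID and API key below are placeholders you would copy from the dataset's "API Info" page, not values from this post), the same streaming dataset can also be fed directly through its REST push URL, which is essentially what the Power BI connector does for us behind the scenes:

using System;
using System.Net.Http;
using System.Text;
using System.Threading.Tasks;

class PowerBiPushSketch
{
    static async Task Main()
    {
        // Hypothetical push URL of the "FeedbackForm" streaming dataset.
        var pushUrl = "https://api.powerbi.com/beta/<workspace-id>/datasets/<dataset-id>/rows?key=<api-key>";

        // One row matching the fields defined above (all values are made up).
        var row = "[{ \"SpeakerName\": \"Speaker 1\", \"ContentMatureFB\": 8, \"GoodCommunicatorFB\": 9, " +
                  "\"EnjoySessionFB\": 7, \"MetExpectationsFB\": 8, \"SpeakerAvrg\": 8, " +
                  "\"WhoAmI\": \"Developer; Student\", \"SpeakerPicture\": \"https://example.com/speaker1.png\" }]";

        using (var client = new HttpClient())
        {
            var response = await client.PostAsync(pushUrl,
                new StringContent(row, Encoding.UTF8, "application/json"));
            Console.WriteLine((int)response.StatusCode); // 200 means the row was accepted
        }
    }
}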

Microsoft Flow to create an integration process flow

At the beginning of this article, I said that one of the advantages of Microsoft Flow is that it allows Business Users to create and automate workflows across multiple applications and services without the need for developer help. So, for this example, I will try to abstract from the fact that I am a developer and "try" to implement it as if I were a business user: no custom code is allowed!

Microsoft Flow will be the component that allows us to extend a product (OCR software) that, like all products, is limited to certain features and capabilities, so it can communicate with the huge range of SaaS applications that appear on the market every day.

In order to integrate SmartDocumentor OCR with Power BI, we need to create a Microsoft Flow that:

  • Accepts a JSON message through an HTTP POST. For that, we use the 'Request – When a HTTP request is received' trigger.

Processing Feedback Evaluations: Flow process - HTTP trigger

    • And, so that we have friendly tokens to access the elements of the message, we will use a sample of the JSON message to generate the correct JSON Schema. For that you should:
      • From the trigger configuration box, select the option "Use sample payload to generate schema" and paste the sample below:

[
  {
    "Key": "RecognitionRating",
    "Value": "100"
  },
  {
    "Key": "RecognitionStatus",
    "Value": "Final"
  }
]

      • To generate the JSON schema, click “Done”

Processing Feedback Evaluations: Flow process - HTTP trigger Schema
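As a rough description only – the designer generates the schema for you, you don't write it by hand – the resulting JSON Schema simply describes an array of objects, each with two string properties, "Key" and "Value"; that is what gives us the friendly "Key" and "Value" tokens used in the next steps.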

  • Next, we need to create the following support variables, which we will use to store the data extracted from the SmartDocumentor JSON message – a Name/Value pair message – before sending it to the Power BI dataset:
    • count – Integer – initial value 0
    • speakerName – String
    • Picture – String
    • ContentClear – String
    • GoodCommunicator – String
    • EnjoySession – String
    • MetExpectation – String
    • whoAmI – String
    • For that, we need to add a '+New step', 'Add an action' and then enter 'Variables' into the search box. Select 'Variables – Initialize variable' from the actions list.

Processing Feedback Evaluations: Flow process - Variables

    • And repeat the same task for all the variables.

Note: Unfortunately, to date, there is still no way to create multiple variables using a single shape. In my opinion, this could and should be done with a table-like approach instead of the existing one.

Now that we have all the variables we need to store the information extracted from the SmartDocumentor JSON message, we can start extracting it based on the evaluation form. The first part of the form is: "What session are you evaluating":

Processing Feedback Evaluations: Flow process -Evaluation Form

and for us to find out which session is selected, so we can "map" the speaker name and picture, we need to look for the keys "S1", "S2", "S3" and "S4" and see which one of them is filled. To do that we need to:

  • Add a ‘+New step’, ‘… More’ and then select ‘Add a switch case’

Processing Feedback Evaluations: Flow process - Switch case

    • On the “On” property of the switch condition configuration, select from the list of tokens the “Key” token from the “When a HTTP request is received” trigger

Processing Feedback Evaluations: Flow process - Switch case configuration

    • Because this is a Key/Value message with multiple records, the Flow designer will automatically place this switch condition inside a loop ('Apply to each') that will iterate through each key/value pair.
  • In the first case branch, let's rename it 'Case S1' and configure it:
    • On the “Equals” property, type “S1”
    • And then add a new condition by selecting ‘… More’ and then select ‘Add a condition’
      • On the ‘Yes’ branch
        • Choose the "Add an action" option and on the "Choose an action" window, enter "Variables" and select the action "Variables – Set variable"
        • And set the “Picture” variable to the desired value
        • Do the same steps this time to set the “speakerName” variable
      • Leave the ‘No’ branch as it is

Processing Feedback Evaluations: Flow process - Switch case 1

  • Add 3 more new Case branches and repeat the same steps, this time configuring the values to look for "S2", "S3" and "S4".

The second part of the form is: “What is my evaluation”:

Processing Feedback Evaluations: Flow process - Evaluation Form

To extract this information, and because we already have a Switch case condition in place, what we need to do is:

  • Create 4 new different branches in the Switch case condition, one for each question – “Q1”, “Q2”, “Q3” and “Q4” – and then:
    • On the “Equals” property, type “Q1”
    • And then choose the "Add an action" option and on the "Choose an action" window, enter "Variables" and select the action "Variables – Set variable"
      • And set the "ContentClear" variable to the rating provided, by selecting the "Value" token from the "When a HTTP request is received" trigger in the list of tokens
    • Do the same steps for
      • “Q2” to set the variable “GoodCommunicator”
      • “Q3” to set the variable “EnjoySession”
      • “Q4” to set the variable “MetExpectation”

Processing Feedback Evaluations: Flow process - Set variables
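To make it easier to follow what all these branches are doing, here is a minimal conceptual sketch in C# – purely illustrative, since in the real flow this is done with the designer shapes described above, and the speaker names and picture values below are placeholders, not values from the post:

using System.Collections.Generic;

class EvaluationMappingSketch
{
    // These mirror the flow variables created earlier.
    string speakerName = "", picture = "";
    string contentClear = "", goodCommunicator = "", enjoySession = "", metExpectation = "";

    void Map(IEnumerable<KeyValuePair<string, string>> pairs)
    {
        foreach (var pair in pairs)              // the "Apply to each" loop
        {
            switch (pair.Key)                    // the Switch case on the "Key" token
            {
                case "S1":                       // session checkbox: only map when the option is filled
                    if (!string.IsNullOrEmpty(pair.Value)) { speakerName = "Speaker 1"; picture = "speaker1.png"; }
                    break;
                case "S2":
                    if (!string.IsNullOrEmpty(pair.Value)) { speakerName = "Speaker 2"; picture = "speaker2.png"; }
                    break;
                // ... "S3" and "S4" follow the same pattern
                case "Q1": contentClear = pair.Value; break;       // rating questions: copy the "Value" token
                case "Q2": goodCommunicator = pair.Value; break;
                case "Q3": enjoySession = pair.Value; break;
                case "Q4": metExpectation = pair.Value; break;
            }
        }
    }
}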

Finally, the last section is about “Who am I”

Processing Feedback Evaluations: Evaluation Form – Who am I

This is probably the most complicated section. Here, we basically want to traverse a range of positions and, if a position is selected, append it to a string with a separator, something like "Developer; Student". To do that we need to:

  • On the Default branch, add a new condition by selecting ‘… More’ and then select ‘Add a condition’
    • In the condition expression, select "Edit in advanced mode" and then type the condition to only handle the key/value pair if the "count" variable is greater than 13 and less than or equal to 28
@and(greater(variables('count'), 13),lessOrEquals(variables('count'), 28))

Processing Feedback Evaluations: Flow process - condition advance expression

    • On the ‘Yes’ branch
        • We need to check whether the "tag" is selected or not. If it is, we need to add it to the "whoAmI" variable; otherwise, we do nothing. To do that we need an extra if condition: add a new condition by selecting '… More' and then select 'Add a condition'
        • On “Choose a value” property select the “Key” token from the “When a HTTP request is received” trigger.
          • On the condition set “Is not equal to”
          • And in the other “Choose a value” property leave it empty

Processing Feedback Evaluations: Flow process - Anothe condition

      • However, to properly set the "whoAmI" variable we need an additional condition to check whether the "whoAmI" variable is empty or not. If it is empty, we set the "whoAmI" variable with the value; otherwise, we append a separator and the new value to the existing value. To do that we need to:
        • Add a new condition by selecting ‘… More’ and then select ‘Add a condition’
          • On “Choose a value” property select the “whoAmI” token from the “Variable” context.
          • On the condition set “Is not equal to”
          • And in the other “Choose a value” property leave it empty
      • On the "Yes" condition, choose the "Add an action" option and on the "Choose an action" window, enter "Variables" and select the action "Variables – Set variable"
        • And set the "whoAmI" variable to the desired value – select the "Key" token from the "When a HTTP request is received" trigger.
      • And on the "No" condition, choose the "Add an action" option and on the "Choose an action" window, enter "Variables" and select the action "Variables – Append to string variable"
        • And append the desired value to the "whoAmI" variable – select the "Key" token from the "When a HTTP request is received" trigger.

Processing Feedback Evaluations: Flow process - condition chain
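Once again as a purely illustrative sketch (the real flow uses the condition shapes plus the "Set variable" and "Append to string variable" actions described above), the nested conditions boil down to something like this:

class WhoAmISketch
{
    string whoAmI = "";

    // key/value is the current pair; count is the loop position maintained by the flow.
    void Accumulate(string key, string value, int count)
    {
        // @and(greater(variables('count'), 13), lessOrEquals(variables('count'), 28))
        if (count <= 13 || count > 28) return;   // only the "Who am I" positions of the form

        // Option not ticked (the flow tests the token against an empty value), nothing to do.
        if (string.IsNullOrEmpty(value)) return;

        whoAmI = string.IsNullOrEmpty(whoAmI)
            ? key                                // first selection: "Set variable"
            : whoAmI + "; " + key;               // later selections: "Append to string variable"
    }
}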

  • Finally, after the Switch case condition, and just before finishing the "Apply to each" cycle, we need to increment the "count" variable:
    • Choose the "Add an action" option and on the "Choose an action" window, enter "Variables" and select the action "Variables – Increment variable"
      • On the "Name" property select the "count" variable
      • And on the "Value" property set 1

Processing Feedback Evaluations: Flow process - increment count

To finalize the entire process, we just need to:

  • Calculate the rating average – the sum of all the question results divided by the number of questions.
  • And send it to Power BI

To accomplish that, we need to:

  • Add a '+New step', 'Add an action' and then enter "Variables" into the search box. Select the action "Variables – Initialize variable"
    • On the "Name" property set the "SpeakerAvrg" variable
    • On the "Type" property select Integer
    • And in the "Value" property, from the context dialog box, select the "Expression" tab and set the following expression:
int(div(add(add(int(variables('ContentClear')), int(variables('GoodCommunicator'))) , add(int(variables('EnjoySession')), int(variables('MetExpectation')))),4))

Processing Feedback Evaluations: Flow process - calculate average
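As a quick worked example (with made-up ratings): if ContentClear = 8, GoodCommunicator = 9, EnjoySession = 7 and MetExpectation = 8, the expression evaluates add(add(8, 9), add(7, 8)) = 32 and div(32, 4) = 8, so "SpeakerAvrg" is pushed to Power BI as 8 – note that any remainder of the division is truncated, since the result is wrapped in int().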

  • In the last step of the flow, we push the data into the Power BI streaming dataset created earlier by using the new 'Power BI' connector. To do this we need to:
    • Add a ‘+New step’, ‘Add an action’, and then enter ‘Power BI’ into the search box. Select “Add row to streaming dataset” from the actions list.
      • Select the name of the workspace and then the name of the streaming dataset
      • The next step is to select the Table titled “RealTimeData”
      • And finally, map the properties to the different variable tokens, as shown in the picture

Processing Feedback Evaluations: Flow process - Power BI

Give a proper name to the flow and save it

The end result

After saving the Microsoft Flow and processing the evaluation forms, the result is this beautiful and interactive report that we can present on a monitor during the breaks of our events:

Processing Feedback Evaluations paper: SmartDocumentor Logic App process Power BI dashboard

Author: Sandro Pereira

Sandro Pereira lives in Portugal and works as a consultant at DevScope. In the past years, he has been working on implementing Integration scenarios both on-premises and cloud for various clients, each with different scenarios from a technical point of view, size, and criticality, using Microsoft Azure, Microsoft BizTalk Server and different technologies like AS2, EDI, RosettaNet, SAP, TIBCO etc. He is a regular blogger, international speaker, and technical reviewer of several BizTalk books all focused on Integration. He is also the author of the book “BizTalk Mapping Patterns & Best Practices”. He has been awarded MVP since 2011 for his contributions to the integration community.

BizTalk Pipeline Components Extensions Utility Pack for BizTalk Server 2016 available on GitHub

I finally started and published the first version of a sibling project to the BizTalk Mapper Extensions UtilityPack for BizTalk Server 2016: the BizTalk Pipeline Components Extensions Utility Pack.

What is BizTalk Pipeline Components Extensions Utility Pack?

BizTalk Pipeline Components Extensions Utility Pack is a set of libraries with several custom pipeline components that can be used in receive and send pipelines, providing an extension of BizTalk's out-of-the-box pipeline capabilities.

BizTalk Pipeline Components Extensions Utility Pack

What’s to expect in this version?

Content-Based Routing Pipeline Components

CBR IDoc Operation Promotion Encode component (CBRIdocOperationPromotionEncode)

  • Content-Based Routing Component to promote IDOC Operation property.
    • This component requires a single configuration value: the part of the MessageType string to be ignored. It will then take the last string (word) from the MessageType message context property and promote it to the Operation message context property.
    • This component is to be used on the Encode stage of BizTalk Server Send Pipelines and to be used exclusively on Send Ports.

CBR Operation Promotion Encode component (CBROperationPromotionEncode)

  • Content-Based Routing Component to promote Operation property.
    • This component doesn't require any configuration. It will take the value (word) that comes after the cardinal (#) in the MessageType message context property and promote it to the Operation message context property (a hypothetical sketch of this logic follows the list below).
    • This component is to be used on the Encode stage of BizTalk Server Send Pipelines and to be used exclusively on Send Ports.
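For readers curious about what such a component does under the hood, here is a minimal, hypothetical sketch of the promotion logic. This is not the actual source code of the pack – you can inspect that on GitHub – and only the core of the Execute method is shown, without the usual IBaseComponent/IComponent plumbing:

using Microsoft.BizTalk.Component.Interop;
using Microsoft.BizTalk.Message.Interop;

public class CbrOperationPromotionSketch
{
    private const string SystemPropertiesNs = "http://schemas.microsoft.com/BizTalk/2003/system-properties";

    // Read MessageType, keep what follows the '#', and promote it as the Operation context property.
    public IBaseMessage Execute(IPipelineContext context, IBaseMessage message)
    {
        var messageType = message.Context.Read("MessageType", SystemPropertiesNs) as string;

        if (!string.IsNullOrEmpty(messageType) && messageType.Contains("#"))
        {
            var operation = messageType.Substring(messageType.IndexOf('#') + 1);
            message.Context.Promote("Operation", SystemPropertiesNs, operation);
        }

        return message;
    }
}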

The project is available in the BizTalk Server Open Source Community repository on GitHub (https://github.com/BizTalkCommunity), and everyone can contribute new pipeline components that extend or improve the existing BizTalk Server capabilities.

At the moment it is only available for BizTalk Server 2016 but it will soon be compiled and available for previous versions of the product.

Special thanks to my team coworker at DevScope: Pedro Almeida for helping me on this project.

Deploying Pipeline Components

All the .NET pipeline component assemblies (native and custom) must be located in the Pipeline Components folder to be executed by the server. If the pipeline with a custom component will be deployed across several servers, the component’s binaries must be present in the specified folder on every server.
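On a default installation, that folder is typically something like C:\Program Files (x86)\Microsoft BizTalk Server 2016\Pipeline Components – the exact path depends on where you installed BizTalk Server.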

You do not need to add a custom pipeline component to be used by the BizTalk Runtime to the Global Assembly Cache (GAC).

To know more about Deploying Pipeline Components, please see: Deploying Pipeline Components

Where to download?

You can download BizTalk Pipeline Components Extensions Utility Pack from GitHub here:
BizTalk Pipeline Components Extensions Utility Pack
GitHub

Microsoft BizTalk Mapper Unable to load wrapper

Sometimes errors seem to find me; sometimes it feels like I'm chasing errors! And because someone did something that you shouldn't do – configuring the build output folder of a Functoid project to be the <BizTalk Server installation folder>\Developer Tools\Mapper Extensions directory – I got a new and strange error while trying to open a BizTalk map inside Visual Studio: Microsoft BizTalk Mapper Unable to load wrapper. Here is the full error message:

Microsoft BizTalk Mapper

Unable to load wrapper <BizTalk Server installation folder>\Developer Tools\Mapper Extensions\Microsoft.BizTalk.Interop.Agent.dll

Microsoft BizTalk Mapper Unable to load wrapper

Of course, at the time I was quite curious to know the cause of this problem, because I hadn't done anything wrong – I was just building a custom Functoid project – and a few minutes earlier I had actually been working on a map and was able to open it without any kind of problem.

Cause

When you build a project – any kind of project – by default it will copy the project assemblies as well as their dependencies to the bin\Debug folder; again, this is the default behavior.

In this case, changing this behavior in a custom Functoid project to the <BizTalk Server installation folder>\Developer Tools\Mapper Extensions directory was causing the problem, because the build was copying not only the custom Functoid assembly but also some internal BizTalk assemblies – in this case, Microsoft.BizTalk.Interop.Agent.dll – that are incompatible with the internal BizTalk Mapper wrapper.

Microsoft BizTalk Mapper Unable to load wrapper cause

Note: the <BizTalk Server installation folder>\Developer Tools\Mapper Extensions directory is where Visual Studio BizTalk projects look for custom functoids to be added to the Toolbox.

Solution

Once again, the solution is simple: you should delete all the unnecessary assemblies, especially the product's internal ones.

Once you delete all the unnecessary assemblies – in this case Microsoft.BizTalk.Interop.Agent.dll, and I also advise deleting all the ones selected in the picture above – the Unable to load wrapper problem will be solved and you will once again be able to open the BizTalk Mapper inside Visual Studio without any kind of problem.

How to fix BizTalk Server 2016 SSO Administration Console with PowerShell

Have you ever noticed that your SSO Administration Console tool doesn't open in BizTalk Server 2016? Fortunately for the Microsoft BizTalk Server team, this tool is not heavily used by customers; nevertheless, it is an existing and valid tool that needs to work properly.

What is this tool?

You can install the Enterprise Single Sign-On (SSO) Administration component as a stand-alone feature. This is useful if you need to administer the SSO system remotely. The hardware and software requirements are the same as for a typical Enterprise SSO installation.

After you install the administration component, you must use either ssomanage.exe or the SSO Administration MMC snap-in to specify the SSO server that will be used for management. Both processes are included in the procedure that follows.

Of course, with this tool, SSO Administration, you can do more than just configure the SSO server that will be used for management. Using the Enterprise SSO Administration console, administrators can easily manage Affiliate Applications, Mappings, SSO Servers, SSO System and also perform Password Management operations. There are 4 snap-ins for Enterprise SSO that administrators can use.

  • Affiliate Applications – Administrators can use this to perform administrative operations on Affiliate Applications. For each Affiliate Application, mappings can be created and managed. An Affiliate Application can be created by SSO Affiliate Administrators and SSO Administrators. When it is defined, the administrator can optionally specify an Application Administrators account that contains users who can perform administrative operations on that Affiliate Application. In addition, an Application Users account must be specified that contains Windows domain users for whom mappings can be created. Other operations such as enabling or disabling Affiliate Applications, configuring SSO tickets for the Affiliate Application, and enabling or disabling mappings can also be performed by administrators using this snap-in.
  • Password Management – Administrators can use this snap-in to perform administrative operations on Password Synchronization Adapters and Password Filters. Administrative operations need to be performed by the SSO Administrators. A filter rule can also be defined within an Adapter configuration. Once an Adapter or Filter is created, an administrator can associate Affiliate Applications with the Adapter or Filter so that the synchronization and filter rules defined are applied to that application.
  • Servers – Administrators can build a list of SSO Servers to perform certain administrative operations and to view their status. Within an SSO system, an administrator can also perform a discovery process to automatically discover and add all the SSO Servers within the SSO system.
  • System – Administrators can view SSO System level settings. These settings are stored in the centralized SSO Credential Database. Modifying these settings will apply to all SSO Servers that are using this SSO Credential Database. In addition, administrators can manage the Master Secret Server and perform tasks such as generate, backup and restore the secret. The system-level administrative operations can be performed only by the SSO Administrators.

BizTalk Server 2016 SSO Administration Console

However, the SSO Administration shortcut, which points to the Microsoft.EnterpriseSingleSignOn.StartMMC.exe executable file, is not working properly in BizTalk Server 2016.

Cause

The reason this problem happens is that, in BizTalk Server 2016, there is a bug and the installation wizard doesn't create all the necessary keys in the Registry.

Also, if you look at the properties of the "SSO Administration" shortcut, it points to the Microsoft.EnterpriseSingleSignOn.StartMMC.exe executable file. I don't know why Microsoft decided on this strategy, but in reality this executable file is just "an easy way" (or dummy way) to open "ENTSSO.msc", normally present in the "C:\Program Files\Common Files\Enterprise Single Sign-On" folder, and it reads the "ENTSSO.msc" installation path from the Registry:

…
key = Registry.LocalMachine.OpenSubKey(@"Software\Microsoft\ENTSSO");
object obj2 = key.GetValue("InstallPath") as string;
…
string str2 = "\"" + str + "\\entsso.msc\"";
ProcessStartInfo startInfo = new ProcessStartInfo("mmc.exe") {
    Arguments = str2
};
Process.Start(startInfo);
…

However, this is what the keys look like in BizTalk Server 2016:

BizTalk-Server-2016-ENTSSO-Registry

Solution

The solution is easy, and you have a few different ways to do it.

If you want to open the SSO Administration snap-in, you can:

  • Go to the Enterprise Single Sign-On installation folder
    • Normally, "C:\Program Files\Common Files\Enterprise Single Sign-On"
  • And directly execute the “ENTSSO.msc” (double click)

If you want to fix the SSO Administration shortcut to work properly, you can:

  • Open the Registry
  • And under HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\ENTSSO add
    • the following string: InstallPath
    • with the value: C:\Program Files\Common Files\Enterprise Single Sign-On

BizTalk Server 2016: PowerShell to fix the SSO Administration Console

Because in a normal situation there are more keys (strings) present in the Registry, I decided to create a simple PowerShell script that you can use to fix all these issues and get everything working normally again:

... 
Set-ItemProperty -Path $registryPath -Name InstallPath -Value "C:\Program Files\Common Files\Enterprise Single Sign-On" 
Set-ItemProperty -Path $registryPath -Name ProductCode -Value "{F89B22BC-2768-4237-B300-5CFA52D9AC84}" 
...

THIS POWERSHELL IS PROVIDED “AS IS”, WITHOUT WARRANTY OF ANY KIND.

BizTalk Server 2016: PowerShell to fix the SSO Administration Console (2 KB)
Microsoft | TechNet Gallery

Visual Studio error: Unable to find transmitPipeline.vstemplate. Please repair the product to fix this issue

My blog backlog is full of precious things and I stopped counting the number of pages I have in my OneNote waiting to be published. Today I randomly selected this Unable to find transmitPipeline.vstemplate error that I got on December 4, 2017, at a client when I tried to do something – unfortunately I don't remember what – in Visual Studio. The full error message was:

Unable to find "C:\Program Files (x86)\Microsoft Visual Studio 14.0\Common7\IDE\ItemTemplatesCache\BizTalk\Pipeline Files\TransmitPipeline.zip\transmitPipeline.vstemplate".

Please repair the product to fix this issue.

Unable to find transmitPipeline.vstemplate

Cause

Well, I don’t know the exact reasons why this happened or why these files were missing from the development machine.

However, as is clear from the error message description, the Visual Studio templates – in this case, the Transmit Pipeline template – were missing from the expected directory.

Solution

The solution for this problem is quite simple: use the BizTalk Server ISO file to repair the BizTalk Server installation.

This will "install/copy" all the files, including all the BizTalk Server Visual Studio templates, to the development machine.

Backup BizTalk Server job failed. Executed as user NT SERVICE\SQLSERVERAGENT. Cannot open backup device destination path.

Another day, another error to report, retrieved directly from my backlog to be published. Today it is about another error that can happen on the Backup BizTalk Server job: "Executed as user NT SERVICE\SQLSERVERAGENT. Cannot open backup device destination path\database name". I got this error a few months ago at a client while doing an installation assessment:

Executed as user: NT SERVICE\SQLSERVERAGENT. Cannot open backup device 'C:\Program Files\Microsoft SQL Server\MSSQL13.BIZTALK\MSSQL\Backup\<destination path>\NAME_BAMPrimaryImport_Log_BTS_2017_12_18_15_07_41_447.bak'. Operating system error 123(The filename, directory name, or volume label syntax is incorrect.). [SQLSTATE 42000] (Error 3201) BACKUP LOG is terminating abnormally. [SQLSTATE 42000] (Error 3013). The step failed.

Backup BizTalk Server job: Executed as user NT SERVICE\SQLSERVERAGENT. Cannot open backup device destination path\database name

This error was occurring on the third step of the Backup BizTalk Server job: MarkAndBackupLog.

Cause

On the third step of the Backup BizTalk Server job, MarkAndBackupLog, if you check the input parameters of the stored procedure invoked there – the "sp_MarkAll" stored procedure – you will find that the second parameter is the location of the backup files, and this location must exist in the file system.
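As a purely illustrative example (the exact text depends on your environment and product version), the step command looks roughly like: exec [dbo].[sp_MarkAll] 'BTS', '<destination path of backup files>' – and it is that second parameter that must point to an existing folder.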

This error may occur for one of the following reasons:

  • This step is not properly configured, and you still have "<destination path>" set as the value for the location of the backup files – the second parameter;
  • Or the folder/path that you defined as the backup location doesn't exist;

Solution

You must remember that all the paths specified in the BizTalk jobs must exist in the file system.

But in this case, as you can see in the error description, the backup job was not properly configured and still had "<destination path>" set as the value for the location of the backup files.

To solve this problem, you need to:

  • Press the "Windows key", type "SQL Management" or "SQL", and click on "SQL Server Management Studio".
  • In Object Explorer panel, connect to the SQL Server instance and expand the server tree.
  • Expand the “SQL Server Agent” node
  • Expand the “Jobs” node
  • Double click “Backup BizTalk Server (BizTalkMgmtDb)” to open the job properties window.
  • In the Job Properties – Backup BizTalk Server (BizTalkMgmtDb) dialog box, under “Select a page”, click “Steps”.
  • In the “Job step list”, click on the job you want to edit, in this case, “MarkAndBackupLog”, and then click “Edit”
  • On the “Command” property correctly specify a path for the backup files

Backup BizTalk Server job: Executed as user NT SERVICE\SQLSERVERAGENT. Cannot open backup device destination path fixed

After I properly configured the job, this error was solved and I was able to successfully run the Backup BizTalk Server job.

BizTalk Mapper Extensions UtilityPack: Convert to a Number Functoid

It's been a while since I last made any changes to one of my favorite pet projects: BizTalk Mapper Extensions UtilityPack! Today I'm happy to announce the release of a new BizTalk Functoid: the Convert to a Number Functoid.

Note: And it will not be the only one – soon I will release two new functoids that I'm working on.

Today I was working on a map in which I needed to transform several numbers delivered as strings in a decimal format:

  • 1.000
  • 121.000

to an Oracle NUMBER(x) – without decimals.

Initially, I thought of creating a Scripting Functoid and reusing it inside the map for each element. However, I realized that I would have to use the same transformation in different maps… so I ended up creating this simple Functoid.

Convert to a Number Functoid

This functoid allows you to convert a string to a number (integer)

Parameters

The functoid takes three mandatory input parameters:

  • The input value to be converted to a number;
    • 1.0000
    • 1.000,00
    • 123,10
  • A character that specifies the decimal separator (can be empty);
  • A character that specifies the group separator (can be empty);

The output of the functoid will be a number (integer), for example: 1234

BizTalk-Server-Convert-to-Number-Functoid

In the sample shown in the picture above, we are receiving a string with a decimal – "1.000" – and we want to transform it into "1". So, in this case, the functoid configuration will be as follows (a conceptual sketch of the conversion appears after the list):

  • the first input will be our value;
  • the second input will be "." (dot), which describes the decimal separator;
  • and the third input will be "" (empty) because there isn't any group separator (delimiter);
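As a rough idea of the conversion the functoid performs – a hypothetical sketch only, not the actual functoid source, which is available in the GitHub repository linked below – the logic boils down to parsing the string with the separators you supply and returning the integral part:

using System;
using System.Globalization;

static class ConvertToNumberSketch
{
    // value: the input string, e.g. "1.000"
    // decimalSeparator / groupSeparator: the characters configured on the functoid (either can be empty)
    public static string ConvertToNumber(string value, string decimalSeparator, string groupSeparator)
    {
        var format = new NumberFormatInfo();
        if (!string.IsNullOrEmpty(decimalSeparator)) format.NumberDecimalSeparator = decimalSeparator;
        if (!string.IsNullOrEmpty(groupSeparator)) format.NumberGroupSeparator = groupSeparator;

        // Parse with the supplied separators and drop the decimals.
        var parsed = decimal.Parse(value, NumberStyles.Number, format);
        return ((int)parsed).ToString();
    }
}

// ConvertToNumber("1.000", ".", "")     -> "1"
// ConvertToNumber("1.000,00", ",", ".") -> "1000"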

BizTalk Mapper Extensions UtilityPack: Project Description

BizTalk Mapper Extensions UtilityPack is a set of libraries with several useful functoids to include and use in a map, which will provide an extension of the BizTalk Mapper's capabilities.

Where to download?

You can download this functoid along with all the existing ones in the BizTalk Mapper Extensions UtilityPack here:
BizTalk Server Community Extensions Utility Packs GitHub Repository
GitHub

BizTalk WCF-ORACLEDB error: This is because either (a) ambient transaction is present and the TNS alias is longer than 39 characters

Another day, another error to report – I still have plenty of them in my backlog to be published; they are an "easy and quick way" to publish something on my blog when I really don't have much free time to write something different. Today it is about the "This is because either (a) ambient transaction is present and the TNS alias is longer than 39 characters" error message that I got when I was trying to connect to an ORACLE database for the first time to insert some data:

Microsoft.ServiceModel.Channels.Common.MetadataException: Metadata resolution failed for OperationId: “http://Microsoft.LobServices.OracleDB/2007/03/STGADMIN/Table/TRANSACTIONS/Insert”. —> Microsoft.ServiceModel.Channels.Common.ConnectionException: Due to an Oracle Client limitation, the adapter failed to open a connection. This is because either (a) ambient transaction is present and the TNS alias is longer than 39 characters, or (b) ambient transaction is present and a non-TNS based URI was used. To resolve this, use a TNS alias to connect to Oracle and make sure it is not more than 39 characters.

at Microsoft.Adapters.OracleDB.OracleDBConnection.OpenConnection(OracleCommonExecutionHelper executionHelper)

— End of inner exception stack trace —

Server stack trace:

at System.Runtime.AsyncResult.End[TAsyncResult](IAsyncResult result)

at System.ServiceModel.Channels.ServiceChannel.SendAsyncResult.End(SendAsyncResult result)

at System.ServiceModel.Channels.ServiceCh.

This is because either (a) ambient transaction is present and the TNS alias is longer than 39 characters

Followed by other similar warning messages:

A message sent to adapter “WCF-Custom” on send port “INSERT_PAYMENTS_WCFORACLE” with URI “oracledb://IP-ADDRESS:PORT-NUMBER/PATH” is suspended.

Error details: Microsoft.ServiceModel.Channels.Common.MetadataException: Metadata resolution failed for OperationId: “http://Microsoft.LobServices.OracleDB/2007/03/STGADMIN/Table/PAYMENTS/Insert”. —> Microsoft.ServiceModel.Channels.Common.ConnectionException: Due to an Oracle Client limitation, the adapter failed to open a connection. This is because either (a) ambient transaction is present and the TNS alias is longer than 39 characters, or (b) ambient transaction is present and a non-TNS based URI was used. To resolve this, use a TNS alias to connect to Oracle and make sure it is not more than 39 characters.

at Microsoft.Adapters.OracleDB.OracleDBConnection.OpenConnection(OracleCommonExecutionHelper executionHelper)

— End of inner exception stack trace —

Server stack trace:

at System.Runtime.AsyncResult.End[TAsyncResult](IAsyncResult result)

at System.ServiceModel.Channels.ServiceChannel.SendAsyncResult.End(SendAsyncResult result)

at System.ServiceModel.Channels.ServiceChannel.EndCall(String action, Object[] outs, IAsyncResult result)

at System.ServiceModel.Channels.ServiceChannel.EndRequest(IAsyncResult result)

Exception rethrown at [0]:

at System.Runtime.Remoting.Proxies.RealProxy.HandleReturnMessage(IMessage reqMsg, IMessage retMsg)

at System.Runtime.Remoting.Proxies.RealProxy.PrivateInvoke(MessageData& msgData, Int32 type)

at System.ServiceModel.Channels.IRequestChannel.EndRequest(IAsyncResult result)

at Microsoft.BizTalk.Adapter.Wcf.Runtime.WcfClient`2.RequestCallback(IAsyncResult result)

MessageId: {DE7ABF70-D6B7-4FC8-A570-5AAE4FFACBB9}

InstanceID: {5842C3C7-6746-4A56-8707-FF53123A4101}

Cause

A non-TNS-based URI is not supported under an ambient transaction. If you have to use transactions, you should use a TNS alias.

Note: the TNS alias needs to be no more than 39 characters long.
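For illustration only (check the adapter documentation for the exact connection URI grammar): a non-TNS-based URI has the shape of the one shown in the error above – oracledb://<host>:<port>/<service name> – while a TNS-based URI simply references the alias defined in your tnsnames.ora, something like oracledb://MYTNSALIAS, and that alias must stay within the 39-character limit when ambient transactions are involved.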

Solution

To solve this problem, you need to:

  • Open the send port properties, by double-clicking on the port;
  • Click on “Configure…” button under Transport
  • On the Transport Properties window, select the “Message” tab and then uncheck the “Use Transaction” box under Transactions

This is because either (a) ambient transaction is present and the TNS alias is longer than 39 characters

If you resend the message, this problem should be solved. Additionally, if this error still persists, you should change the following properties in the oracleDBBinding:

  • “enableBizTalkCompatibilityMode” = True (this is set to false per default)
  • And/or “useAmbientTransaction” = False

BizTalk Administration Console: An internal failure occurred for unknown reasons (WinMgmt) fixed by July 30, 2018 Microsoft Security Updates

Last month I wrote a blog post regarding the "An internal failure occurred for unknown reasons (WinMgmt)" error in the BizTalk Server Administration Console caused by the July 10, 2018 Microsoft Security Updates; you can see the entire blog post here: July 10, 2018 Microsoft Security Updates cause errors on the BizTalk Administration Console: An internal failure occurred for unknown reasons (WinMgmt). In it, I documented a workaround to solve the following problem:

TITLE: BizTalk Server Administration

——————————

Failed to create a BizTalkDBVersion COM component installed with a BizTalk server.

Class not registered (WinMgmt)

——————————

BUTTONS:

OK

——————————

An internal failure occurred for unknown reasons (WinMgmt)

And because of that you wouldn't be able, for example, to restart the host instances from the BizTalk Server Administration Console.

I'm happy to inform you that Microsoft has already released a new security update that fixes this problem.

Cause

As the official documentation (https://support.microsoft.com/en-gb/help/4345913/access-denied-errors-after-installing-july-2018-security-rollup-update) states: Applications that rely on the .NET Framework to initialize a COM component and that run with restricted permissions may fail to start or run correctly after you install the July 2018 Security and Quality Rollup updates for .NET Framework.

Microsoft .NET Framework runtime uses the process token to determine whether the process is running within an elevated context. These system calls can fail if the required process inspection permissions are not present. This causes an “access denied” error.

After you install any of the July 2018 .NET Framework Security Updates, a COM component fails to load because of “access denied,” “class not registered,” or “internal failure occurred for unknown reasons” errors.

So, the cause of these problems was security updates that were released by Microsoft on July 10, 2018.

Solution

On July 30, 2018, Microsoft released new Security Updates (https://www.catalog.update.microsoft.com/Search.aspx?q=4346877) that fix these issues.

An internal failure occurred for unknown reasons (WinMgmt) fixed

To solve my problem, I:

  • I applied/installed all the available updates on my BizTalk Server machine;
  • I manually downloaded and installed the Security Update marked in the picture above;

After I restarted my BizTalk Server machine, this problem was fixed.

BizTalk WCF-ORACLEDB error: PL/SQL: ORA-00917: missing comma

In the last few months I have been working with the ORACLE adapter, mainly doing direct insert operations on ORACLE tables, and as you might imagine, I found some errors that I find interesting to document. One of these errors was PL/SQL: ORA-00917: missing comma.

The first time I tried to insert data directly into a table – without using any stored procedure, which is what I normally use in SQL Server or in other ORACLE implementations I have worked with – I got the following error:

A message sent to adapter “WCF-Custom” on send port “SEND-PORT-NAME” with URI “oracledb://IP-ADDRESS:PORT-NUMBER/PATH” is suspended.

Error details: Microsoft.ServiceModel.Channels.Common.TargetSystemException: ORA-06550: line 2, column 677:

PL/SQL: ORA-00917: missing comma

ORA-06550: line 2, column 1:

PL/SQL: SQL Statement ignored —> Oracle.DataAccess.Client.OracleException: ORA-06550: line 2, column 677:

PL/SQL: ORA-00917: missing comma

ORA-06550: line 2, column 1:

PL/SQL: SQL Statement ignored

at Oracle.DataAccess.Client.OracleException.HandleErrorHelper(Int32 errCode, OracleConnection conn, IntPtr opsErrCtx, OpoSqlValCtx* pOpoSqlValCtx, Object src, String procedure, Boolean bCheck, Int32 isRecoverable)

at Oracle.DataAccess.Client.OracleException.HandleError(Int32 errCode, OracleConnection conn, String procedure, IntPtr opsErrCtx, OpoSqlValCtx* pOpoSqlValCtx, Object src, Boolean bCheck)

at Oracle.DataAccess.Client.OracleCommand.ExecuteNonQuery()

at Microsoft.Adapters.OracleCommon.OracleCommonUtils.ExecuteNonQuery(OracleCommand command, OracleCommonExecutionHelper executionHelper)

— End of inner exception stack trace —

Server stack trace:

at System.Runtime.AsyncResult.End[TAsyncResult](IAsyncResult result)

at System.ServiceModel.Channels.ServiceChannel.SendAsyncResult.End(SendAsyncResult result)

at System.ServiceModel.Channels.ServiceChannel.EndCall(String action, Object[] outs, IAsyncResult result)

at System.ServiceModel.Channels.ServiceChannel.EndRequest(IAsyncResult result)

Exception rethrown at [0]:

at System.Runtime.Remoting.Proxies.RealProxy.HandleReturnMessage(IMessage reqMsg, IMessage retMsg)

at System.Runtime.Remoting.Proxies.RealProxy.PrivateInvoke(MessageData& msgData, Int32 type)

at System.ServiceModel.Channels.IRequestChannel.EndRequest(IAsyncResult result)

at Microsoft.BizTalk.Adapter.Wcf.Runtime.WcfClient`2.RequestCallback(IAsyncResult result)

MessageId: {29C0CAD2-1D48-4318-8C86-E4A4E38FBD1C}

InstanceID: {F64C65F2-99F2-410E-A92E-418D146C16C9}

PL/SQL: ORA-00917: missing comma error

Cause

When you import the Insert (or other operation) schema for a specific table, unlike SQL Server, which only brings fields for you to fill in, the ORACLE schema will have:

  • Elements (fields) – that are the columns present in that specific table
  • and each Element will have an optional “InlineValue” attribute.

BizTalk-WCF-ORACLE-insert-operation-schema-structure

The element, as you can imagine, is used to send the value that you want to insert into that specific column of the database, but what is the InlineValue attribute? And what is it for?

InlineValue

For all simple data records in a multiple record Insert operation, you can choose to override the value of a record by specifying a value for an optional attribute called “InlineValue“. The InlineValue attribute can be used to insert computed values into tables or views such as populating the primary key column using a sequence or inserting system date (using SYSDATE) into a date column. Again, this is an optional attribute and is available for all simple data records in a multiple record Insert operation.

Basically, in other words, it allows you to call ORACLE functions like SYSDATE, TO_DATE or others for that specific column. And you don't need to insert any data in the element – again, by specifying the InlineValue attribute you will override the value that you put in that element.
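For example, a hypothetical date column could be written as <ns0:CREATION_DATE InlineValue="SYSDATE" /> and the adapter would insert the current system date into that column, regardless of any value placed inside the element.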

Why are you getting the PL/SQL: ORA-00917: missing comma error?

This error typically occurs when you are mistakenly putting the data to be inserted in the “InlineValue” attribute and not in the elements:

<ns0:PAYMENTSRECORDINSERT xmlns:ns0="http://Microsoft.LobServices.OracleDB/2007/03/STGADMIN/Table/RETAIL_PAYMENTS">
  <ns0:RECORD_UNIQUE_ID InlineValue="12345" />
  <ns0:BOOKING_REF_NUMBER InlineValue="12345" />
  <ns0:SOURCE_SYSTEM InlineValue="TEST" />
  <ns0:PAYMENT_METHOD InlineValue="MONEY" />
  <ns0:CURRENCY_CODE InlineValue="EUR" />
  <ns0:REFERENCE_CODE InlineValue="1234" />
  <ns0:PAYMENT_TRANSACTION_ID InlineValue="1234" />
  <ns0:INTERFACE_STATUS InlineValue="N" />
</ns0:PAYMENTSRECORDINSERT>

Solution

The solution, in this case, is very simple: you need to place the data in the existing elements of the schema instead of using the InlineValue attribute:

<ns0:PAYMENTSRECORDINSERT xmlns:ns0="http://Microsoft.LobServices.OracleDB/2007/03/STGADMIN/Table/RETAIL_PAYMENTS">
  <ns0:RECORD_UNIQUE_ID>1234</ns0:RECORD_UNIQUE_ID>
  <ns0:BOOKING_REF_NUMBER>1234</ns0:BOOKING_REF_NUMBER>
  <ns0:SOURCE_SYSTEM>TEST</ns0:SOURCE_SYSTEM>
  <ns0:PAYMENT_METHOD>MONEY</ns0:PAYMENT_METHOD>
  <ns0:CURRENCY_CODE>EUR</ns0:CURRENCY_CODE>
  <ns0:REFERENCE_CODE>1234</ns0:REFERENCE_CODE>
  <ns0:PAYMENT_TRANSACTION_ID>1234</ns0:PAYMENT_TRANSACTION_ID>
  <ns0:INTERFACE_STATUS>N</ns0:INTERFACE_STATUS>
</ns0:PAYMENTSRECORDINSERT>