Any help or pointers on this would be greatly appreciated.
I am seeing some behaviour from our BizTalk server that I can't get to the bottom of. I presume this is throttling-related, but I can't see anything going on. Here's the scenario:
Inbound CSV files (most around 35k lines) are picked up by a receive location and port (each file has its own port and location) with a flat-file disassembler pipeline and a map. The file is processed into the MessageBox and picked up by an orchestration, where more general processing is done on the message. It is then sent back to the MessageBox (via a correlated send/receive port), where a SQL send port subscribes to the correlated type and inserts into a SQL table. The response is processed by the orchestration, then a new SQL query is run to ensure that the number of records sent equals the number of records inserted.
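For reference, that final reconciliation step is essentially a count comparison. A minimal sketch, assuming a hypothetical target table `dbo.ImportedRecords` keyed by a batch/file identifier (table and column names are illustrative, not from the actual solution):

```sql
-- Hypothetical reconciliation query: count the rows inserted for this
-- file so the orchestration can compare the total against the number
-- of records it originally sent.
SELECT COUNT(*) AS RecordsInserted
FROM dbo.ImportedRecords
WHERE BatchId = @BatchId;
-- The orchestration compares RecordsInserted with the line count of
-- the original CSV and flags a failure if they differ.
```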
The server is very chunky: 12 GB RAM, 2x octo-core 2.8 GHz CPUs (16 cores in total).
I'm getting these issues:
1. The orchestration processing the general message sits there showing "Queued (awaiting processing)".
2. The host process runs out of system memory.
It seems like the server is just sat there "twiddling its thumbs" for no apparent reason.
Any help appreciated.
I have a similar problem: sometimes an orchestration waits for a message even though the message is in the MessageBox :(
When I restart the host, the message is processed without problems.
Hi Terry, do you have any update or workaround related to your issue?
Hi, nothing has come to light yet. I'm actually looking at using SSIS to import the transformed XML rather than using the SQL adapter: I'll let the pipeline do all the processing and dump the output to a file send port for the SSIS package to pick up.
It's a bit of a cop-out, as I would prefer BizTalk to do the processing from start to finish, but at the moment it just doesn't seem like an efficient way of doing it.
Reading your post, here are some pointers to what might be causing this problem:
- You have a map in your solution. Is there, by any chance, any inline XSLT being used? If so, this can cause an out-of-memory exception with large files.
- What are the settings for the host processing this orchestration? The host's process memory usage threshold (default value 25) could be increased so the host can use more memory.
Thanks for the post. There are no XSLTs in the maps; they're just very basic.
The memory threshold is currently at 50%. I had read about increasing it, but it didn't seem to have any benefit.
I'd also read that because the host the SQL adapter runs in is 32-bit on a 64-bit machine, a percentage setting only applies to a percent of 2 GB, whereas if you specify an absolute amount you can use up to (almost) 4 GB, so I had it set at 3 GB at one point. Again, no visible benefit.
It would appear to me that the root of the problem is processing the response from the SQL adapter.
The stored procedure the adapter calls returns "success" for each row inserted, so if I'm inserting a file with 35k lines, I get a response message with 35k body parts, each saying "success". This seems to be where the problem lies: opening a response with even a few hundred body parts takes a while, and opening a message with 35k body parts basically kills the console.
Is there a more efficient way of processing such a file through the SQL adapter? I'm not concerned about getting a success response for each line, because every time there has been a duff line in a file, the process has suspended anyway, so I will always know if there was a problem with that file.
Is it possible to just "dump" the file into SQL and let it get on with it, rather than having to handle a response?
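One way to shrink the response is to have the stored procedure stop emitting a result per insert, so the adapter has nothing to turn into 35k body parts. A minimal sketch, with illustrative table and procedure names (not from the actual solution):

```sql
CREATE PROCEDURE dbo.InsertImportedRecord
    @BatchId  INT,
    @LineData NVARCHAR(4000)
AS
BEGIN
    -- Suppress the "n rows affected" messages; these are what the
    -- SQL adapter can end up surfacing as per-row response content.
    SET NOCOUNT ON;

    INSERT INTO dbo.ImportedRecords (BatchId, LineData)
    VALUES (@BatchId, @LineData);

    -- Deliberately return no result set per row; the existing
    -- reconciliation count query reports the total once per file.
END
```

With no per-row result coming back, the existing count-check query still tells you whether every line made it in, and the solicit-response pattern could potentially be replaced with a plain one-way send port so there is no response message to open at all.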
Any pointers much appreciated.