Hi,
This is a performance issue we are facing with one of our BizTalk applications. The application receives multiple .dat files as input, and each file contains multiple detail records.
BizTalk receives the files. Each file is split into individual records, and an orchestration is initiated for each record. For example, if a file contains 100 records, 100 orchestrations are created, and each orchestration inserts its one record into the database. The records do get inserted, but the per-record overhead is serious. The problem gets much worse when multiple files arrive at once, and worse still when the files contain many records (large file sizes). At this rate, it takes nearly an hour to insert about 25,000 records into the database.
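To make the bottleneck concrete, here is a minimal sketch of the two insert patterns, using an in-memory SQLite table as a hypothetical stand-in for the real database (the table name `detail_records` and the column layout are assumptions, not our actual schema). The first function mimics what the per-record orchestrations effectively do; the second shows the kind of batched, single-transaction insert we are wondering about:

```python
import sqlite3

# Hypothetical stand-in for the real database: an in-memory SQLite table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE detail_records (id INTEGER PRIMARY KEY, payload TEXT)")

def insert_one_by_one(rows):
    """Pattern described above: one insert and one commit per record,
    as if each orchestration handled a single row independently."""
    for row in rows:
        conn.execute("INSERT INTO detail_records VALUES (?, ?)", row)
        conn.commit()  # per-record commit: the main source of overhead

def insert_batched(rows):
    """Alternative: insert all rows from one file in a single transaction."""
    conn.executemany("INSERT INTO detail_records VALUES (?, ?)", rows)
    conn.commit()  # one commit for the whole batch
```

The batched version amortizes the connection/transaction cost over the whole file instead of paying it once per record, which is where most of the hour seems to be going.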
Is there any solution, idea, or alternative approach that would help improve performance and get the records into the database much more quickly?
Help!!!