BAM tracking data not moved to BAM Archive database

There are a few really good blog posts that explain BAM – like this one from Saravana Kumar and this one by Andy Morrison. They both do a great job of explaining the complete BAM process in detail.

This post, however, focuses on some details of the last step in the process: archiving the data. Let’s start with a quick walk-through of the whole process.

BAM Tracking Data lifecycle

  1. The tracked data is intercepted in the BizTalk process and written to the “BizTalk MsgBox” database.

  2. The TDDS service reads the messages and moves them to the correct table in the “BAM Primary Import” database.

  3. The SSIS package for the BAM activity then has to be triggered (this is a manual job, or something one has to schedule). When executed, the job does a couple of things.

    1. Create a new partitioned table with a name that is a combination of the active table name and a GUID.

    2. Move data from the active table to this new table. The whole point is of course to keep the active table as small and efficient as possible for writing new data to.

    3. Add the newly created table to a database view definition. It is this view we can then use to read all tracked data (including data from the active and partitioned tables).

    4. Read from the “BAM Metadata Activities” table to find out the configured time to keep data in the BAM Primary Import database. This value is called the “online window”.

    5. Move data that is older than the online window to the “BAM Archive” database (or delete it if you have that option).
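
The steps above can be sketched roughly as follows. This is plain illustrative Python, not the actual SSIS internals: the dict keys and function names are made up for the sketch, and in reality all of this runs as generated SSIS tasks against the BAMPrimaryImport database.

```python
import uuid
from datetime import datetime, timedelta

def run_archiving_job(activity, today):
    """Sketch of what the generated SSIS package does for one BAM activity.
    `activity` is a plain dict standing in for the real BAMPrimaryImport
    tables; all names here are illustrative."""
    # 1. Create a new partitioned table named <active table>_<guid>.
    partition_name = f"{activity['table']}_{uuid.uuid4()}"
    # 2. Move the rows out of the active table so it stays small.
    partition = {"name": partition_name,
                 "creation_time": today,
                 "rows": activity["active_rows"]}
    activity["active_rows"] = []
    activity["partitions"].append(partition)
    # 3. Add the new table to the view used to query all tracked data.
    activity["view_members"].append(partition_name)
    # 4. The online window is read from the BAM Metadata Activities table.
    cutoff = today - timedelta(days=activity["online_window_days"])
    # 5. Partitions that have aged past the window go to the BAM Archive
    #    database (or are deleted, depending on configuration).
    still_online = []
    for p in activity["partitions"]:
        if p["creation_time"] < cutoff:
            activity["archived"].append(p)
            activity["view_members"].remove(p["name"])
        else:
            still_online.append(p)
    activity["partitions"] = still_online
    return activity
```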

Sounds simple, doesn’t it? I was however surprised to see that my data was not moved to the BAM Archive database, even though it was clearly outside the configured online window.

So, what data is moved to the BAM Archive database then?

Below is a deployed tracking activity called “SimpleTracking” with an online window of 7 days. Ergo, all data that is older than 7 days should be moved to the BAM Archive database when we run the SSIS job for the activity.

If we then look at the “BAM Completed” table for this activity, we see that all the data is much older than 7 days, as today’s date is 13-11-2009.

So if we run the SSIS job, these rows should be moved to the archive database. Let’s run the SSIS job then, right?

But when we execute the SSIS job, the BAM Archive database is still empty! All we see are the partitioned tables that were created during the first steps of the SSIS job. The data from the active table has been moved to the new partitioned table, but not on to the Archive database.

It turns out that the SSIS job does not look at the “Last Modified” value of each row at all, but at the “Creation Time” of the partitioned table in the “BAM Metadata Partitions” table shown below.

The idea behind this is of course to avoid having to scan potentially huge tables to find the rows that should be moved. But it also means that it will take another 7 days before the data in the partitioned table is actually moved to the archive database.

This might actually be a problem if you haven’t scheduled the SSIS job to run from day one, your BAM Primary Import database is starting to get too big, and you quickly have to move data over to the archive. All you then have to do is change the “Creation Time” value in the BAM Metadata Partitions table so that it falls outside the online window for the activity.
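
To make the consequence concrete, here is a small sketch (plain Python, purely illustrative) of the check the job effectively performs, and of why backdating the partition’s Creation Time makes the data eligible immediately:

```python
from datetime import datetime, timedelta

ONLINE_WINDOW = timedelta(days=7)  # per activity, from BAM Metadata Activities

def eligible_for_archive(partition_creation_time, now):
    """The SSIS job archives a partition based on its Creation Time in the
    BAM Metadata Partitions table -- never on the rows' Last Modified values."""
    return now - partition_creation_time > ONLINE_WINDOW

now = datetime(2009, 11, 13)

# Rows tracked weeks ago land in a partition created *today*, so the very
# first SSIS run leaves them in BAM Primary Import for another full window:
assert not eligible_for_archive(datetime(2009, 11, 13), now)

# The workaround: backdate Creation Time past the online window, and the
# next run moves the partition to the BAM Archive database immediately:
assert eligible_for_archive(datetime(2009, 11, 1), now)
```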

Black Friday shopping early with the Windows Azure Platform TCO/ROI Analysis Tool

Jim Nakashima notes on his blog that Microsoft has just released a “Windows Azure Platform TCO/ROI Analysis Tool”.

I gave the tool a whirl and found some interesting bits.

The GOOD

For a technology that’s currently in CTP, it’s great that Microsoft is being proactive in putting out this tool. With so many questions surrounding the cloud, taking a first stand and noting “these are some of the factors you’ll have to consider when making decisions” is a step in the right direction.

The tool supports multiple currencies, which appear to be updated with fluctuations in currency markets. We live and work in a global IT environment — enough said.

The tool provides multiple base configurations for common scenarios, such as a simple web application, or transactional compute power for a number crunching application. This is helpful if you’d rather go with some common guidelines and tweak them to your needs.

The BAD

There are four models when it comes to hosting: On-Premises, Hosted (at a Provider), Cloud, and Software-as-a-Service. The TCO/ROI tool only provides for comparison shopping of On-Premises to Azure. It’s important to note that the cloud comparison is purely Azure-based, and doesn’t allow you to compare/contrast offerings from other providers.

The tool also makes some assumptions about which Microsoft SKUs you’re using behind the scenes, and that you’re virtualizing everything on-premises. There’s no ability to choose your virtualization provider (VMware vs. Hyper-V), and anyone who’s ever had to walk through product licensing for very specific scenarios will tell you that a second check from a TAM or PAM is a requirement.

The UGLY

The tool is prefaced with a statement that shifts the tone of an otherwise marketing-heavy web site to one Microsoft Legal clearly had its hand in: “MICROSOFT MAKES NO WARRANTIES, EXPRESS, IMPLIED OR STATUTORY, AS TO THE INFORMATION IN THE TCO AND ROI CALCULATOR REPORT.” If one of the intents of the tool is to sell Azure services and drive adoption, helping to assuage concerns is the right tack. We’re all new to this arena; help us by guiding us and answering our questions. This statement has the opposite, hands-off effect. The connotation is: “We make no promises that anything this tool says is valid, that it will help save you money, that it’s the right approach, or that you should really trust us with your computing needs. We could be hiding things, or we could have left things out.”

In the end, the TCO/ROI tool is a step in the right direction, but its output is of little or no real value, due to the limitations of the inputs. I’d be very interested in seeing v2, though.

An UPDATE to Wednesday’s post on Data Privacy in the Cloud: a friend had noted that Microsoft’s paper seemed quite light. In the interest of being fair and balanced, and because comparison shopping seems to be the thing of the day, Amazon Web Services’ paper may be found here. It’s pretty striking how technically detailed Amazon’s paper is where Microsoft’s is not, especially given that AWS’s paper was published first.

Microsoft StreamInsight: The $5 Tour at BaltoMSDN on 11/18

I’ll be speaking at the Baltimore user group, BaltoMSDN, on Wednesday, November 18, 2009 at 6:30 p.m. on Microsoft StreamInsight.

The presentation, Microsoft StreamInsight: The $5 Tour, is an extended version of my talk from CMAP Code Camp, and includes additional demos, as well as a longer discussion on building adapters and developing applications based on CTP2.

With (American) Thanksgiving only two weeks away, StreamInsight makes for a great conversational topic after a big meal.

New feature in SP1 for BizTalk 2006 R2: Override group certificate to sign outgoing messages

Service Pack 1 for BizTalk 2006 R2 has now been released as a beta, and includes a number of hotfixes (see http://support.microsoft.com/kb/974563).

SP1 also includes a new feature: the ability to configure different certificates for signing outgoing AS2 messages for different parties.

Prior to SP1 for R2, we could only set one certificate for signing outgoing AS2 messages per BizTalk group.

After installing SP1, you will see a new tab in the AS2 party properties where you can override the group certificate.

If you do not choose to override the group certificate, the group certificate is used by default.

The R2 documentation already describes how to use this feature; see the topic “To configure a certificate for signing outgoing AS2 messages for a specific party”:

http://msdn.microsoft.com/en-us/library/bb728096(BTS.20).aspx

We are currently working on porting this feature to BizTalk 2009.

Manuel Stern

Validating Incoming Data Using the BizTalk Business Rules Engine

A project team recently asked me if they could use the BizTalk BRE to validate incoming data. I asked what sort of validation we were talking about, and it came down to four areas:

  1. Setting default values when the incoming fields were empty
  2. Doing look-ups based on existing data and populating related fields
  3. Doing concatenation of fields
  4. Catching any […]
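
The first three of those areas can be sketched as plain transformation steps. This is illustrative Python with made-up field names, not BRE syntax – in practice each step would be expressed as a rule in a BRE policy acting on the incoming message:

```python
def validate_order(order, customer_lookup):
    """Sketch of three of the validation areas; the field names
    ('region', 'customer_id', ...) are hypothetical."""
    # 1. Set default values when incoming fields are empty.
    if not order.get("region"):
        order["region"] = "UNKNOWN"
    # 2. Look up existing data and populate related fields.
    customer = customer_lookup.get(order.get("customer_id"), {})
    order.setdefault("customer_name", customer.get("name", ""))
    # 3. Concatenate fields.
    order["full_address"] = ", ".join(
        filter(None, (order.get("street"), order.get("city"))))
    return order
```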

Recent and upcoming BizTalk User Group Sweden events

Some time ago, back on the 20th of November, Richard Seroter visited BizTalk User Group Sweden and talked about BizTalk, SOA and leveraging the cloud. His talks are now live on channel9, part 1 and part 2. A short blogpost explaining the talk is here.

It was great fun to have Richard visit, although some (most) of the laughter and clapping that you hear on the videos is really due to me doing a pantomime Richard imitation outside the frame of the video, and should not be credited to him 😉

Getting the videos published wasn’t lightning fast due to some unfortunate circumstances and personnel changes. Hopefully that will be all ironed out by November 26th when Charles Young swings by us again, this time to talk about the Business Rules Framework. I’m sure it’ll be a classic. Take the chance to say “I was there”. There are still slots left. Sign up here.

Should you happen to pass by Sweden and have an interesting Connected Systems topic that you would like to talk about feel free to drop us a line.

Or, as Richard commented on his visit on his own blog – in a sentence taken totally out of its context 😉 – “If you get the chance to visit this user group as an attendee or speaker, don’t hesitate to do so.”