by community-syndication | Sep 23, 2008 | BizTalk Community Blogs via Syndication
As Aaron pointed out a while back, Pluralsight has joined up with the WCF / WF folks at Microsoft to create a series of screencasts that give developers an introduction to these technologies. My first contribution to this effort is all about creating your first sequential workflow. The screencast shows the basics of creating the workflow, adding activities from the framework, and adding custom activities.
RSS feed for all WF screencasts

by community-syndication | Sep 23, 2008 | BizTalk Community Blogs via Syndication
An issue I recently came across after we upgraded our servers from BizTalk 2006 to BizTalk 2006 R2 was that we were unable to see the new EDI features. This is the context menu I saw when right-clicking the parties:
Which is incorrect; I should be seeing this context menu:
The resolution is that after you upgrade the server, you need to modify the installation in Add/Remove Programs by choosing Modify:
Check the BizTalk EDI/AS2 Runtime:
Afterwards, you need to go into the BizTalk Configuration and configure the new EDI components (I already did this):
Then you get all of the components you are looking for.
by community-syndication | Sep 22, 2008 | BizTalk Community Blogs via Syndication
I’ve been meaning to write a blog on this subject for quite some time… this post will explain authentication and authorization of incoming SOAP messages to BizTalk. Although it might seem like a simple subject to some, I’ve seen enough web apps deployed without authorization that I figure it’s worth writing about…
Here’s the short answer […]
by community-syndication | Sep 22, 2008 | BizTalk Community Blogs via Syndication
Content is king! Ask anyone who is serious about the web and they will tell you so, unless they’re in the midst of getting you to sign away your content. As a consultant or other person on the go, you should be ready to capture content at a moment’s notice. Now, I do a ton of community work, so I probably take this to extremes, but you should consider these options.
Video Capture
Video Camera
A picture is worth a thousand words, but a video is king. Be it user groups, scrums, product demos or anything else, having a video camera close at hand can solve a ton of problems. I carry a nice cheap video camera in my backpack, which I picked up during a Black Friday sale. My unit is a Panasonic PV-GS85, which has a great built-in LED light that can help with close-up camera work in dark rooms. The model is not important; the key here is that having some sort of video capture really helps. There are three levels of video cameras these days: flash, tape, and hard disk.
Mine is a tape unit that records to Mini-DV. This means I can record a lot, about 1.5 hours per tape, but it also means I’ve got to rip the tape back to digital files at 1:1 speed when I’m done, so 1.5 hours of recording is 1.5 hours of ripping.
Flash units store less, but also store as digital files so they transfer to a computer much faster. These can also be very small sometimes, which is nice. They can also be cheaper than tape.
Hard disk units are more expensive, and about the same size as tape units but they also transfer to your computer faster because, again, they are storing files to that hard disk.
On all of these, when you get a unit, realize you’re not trying to film a movie; O.K. quality will likely be fine. Good enough is, by definition, good enough.
Mono-pod
Video cameras are great, but shaky video isn’t. I carry a monopod in my backpack that I can whip out whenever I need to stabilize a video. These gizmos are handy, but remember that mono means one, that’s one leg, which means no walking away. For walking away you’ll need…
Large Tripod
I bought a “large” tripod at my local camera shop. It stays in the car, too big for the backpack, but it is still relatively nearby if I need to record a longer session. Usually I know this on my way in and will carry it with me. I’ve made a habit of recording our company meetings for Sogeti so that they can be shared on our SharePoint portal. This has worked great for me.
Small Tripod
But sometimes you want to walk away and haven’t got time to get the large tripod. For this, I use the wonderful QSX 1001 tripod. It packs up into a tube 2 inches in diameter and 7 inches long. It rides in the water bottle pocket of my backpack and is always ready to be pulled out. Now, even fully extended it only rises to a height of about 12 inches, but resting on a table it is perfect for interviews.
Web Camera
Sometimes the full camera isn’t what you need; instead it’s time to participate in a Live Meeting session or other webcast and you just need a web camera. I carry a Microsoft LifeCam NX-6000 for this. I’m a fan of this unit, but its drivers are enough to drive me crazy. The drivers install a service (MSCAMSVC.exe) which can start consuming tons of CPU cycles even when the camera is disconnected. I now keep the service disabled until I plug in the camera. What I really need is a better web cam at some point that lacks these problems. This unit is workable; just realize you’re going to have to seize control of that service or your box will seriously suffer.
Audio Capture & Playback
Sometimes you don’t need video, and audio alone will be plenty. Sometimes you just want to listen to some tunes while you’re cruising along to your code. Here are my tools for this.
MP3 Player
We all need tunes, and we want them on the go. While I own Zunes and iPods, as Alton Brown says, I hate unitaskers. I’ve used the Creative Zen V Plus for years, and it remains the staple generic MP3 player in my work backpack because it not only plays WMAs, MP3s, and Audible audiobooks, it also has a built-in recorder. Now, this isn’t super high quality audio, but if you’ve got a morning scrum that you need to record, or if you need to write yourself a verbal note, then this is the unit for you.
Audio Recorder
If podcasting is your goal, then the Zen V Plus won’t be up to the quality you want. You need a good quality recorder which can capture audio in multiple different ways. For my purposes, this is the Zoom H4. This unit can record with its two built-in microphones, it can accept two direct inputs from guitars or other instruments, or it can take two XLR microphones and provide them up to 48V of phantom power. Best of all, it just plugs into your computer for retrieval of the recordings, and stores to common SD memory cards. It also comes with a wall socket adapter, or can run portably on 2 AA batteries. I use this for my podcasting efforts and have been very happy with it so far.
Listening
When you’re done capturing, you’re gonna want to listen to all that wonderful content. I’ve raved in the past about the Logitech FreePulse headphones, and I’d still recommend them. The custom power scheme still bothers me though.
Image Capture
There is more to image capture than cameras…
Cameras
No cameras are in my backpack; so far the camera in my phone has always been enough for what I’ve needed. When I vacation I might add my wife’s camera, but otherwise camera phones are the key here.
Scanners
The technology most tying down the modern mobile office is fax. There are a bunch of service options for this, but those services don’t help with the problem of being handed a piece of paper that you want digitized into OneNote or other note-taking software. For this, I carry the Pentax DS Mobile 600, which is a wonderful, USB-powered, color scanner. This unit will rip through pages of handwritten notes and digitize them for your digital consumption. Fair warning: at the current time there are only 32-bit drivers for this unit.
by community-syndication | Sep 22, 2008 | BizTalk Community Blogs via Syndication
You need only flip back about a year on this blog to find that I am clearly a fan of pushing people to contribute to their community, and their world, in a positive way. I was a huge proponent of the We Are Microsoft event last January that has since morphed into the GiveCamp initiative nationwide. I’ve been thrilled to watch that effort franchise itself around the United States, and am sure that it will reach further still. GiveCamps are based on the idea of using our skills, as developers, to impact a local charity in a positive way. This is an incredibly high-touch, high-impact donation of skills that many charities simply lack. But at its heart, it is about giving back. So what can we do between GiveCamps? How can we impact our community and our world in a tangible way without picking up and doing three years of field work around the world?
Well, the truth of the matter is that in between we need to work closely with those people who are on the ground around the world and know where impact can be created. Kiva.org is an organization that lets us do exactly that. They are a micro-financing group that helps bring small loans to people throughout the world. While $50 might buy you a new video game for your Xbox 360, it will also fund two different loans to entrepreneurs around the world.
What is Micro-Financing?
Micro-financing is about lending money directly to entrepreneurs around the world, which can make a real impact on their local economy. It is based on the principles of capitalism, and can be explained much better by sites such as Wikipedia or Kiva themselves.
How does Kiva help?
Simple: they work with partner organizations on the ground to make known to you the needs of these entrepreneurs. They provide a way you can work with others to finance such a loan. They handle collecting those funds and returning them to you.
GiveCamp @ Kiva.org
I’ve created what Kiva calls a Lending Team for GiveCamp on Kiva’s website. Through this team we can contribute to these entrepreneurs as well as track our impact over time. This is a long-term effort, but even $25 can really help change the lives of those involved. Go to Kiva.org, create an account, join the team, and look for someone in need of a loan that you would be willing to back. I’ve already got a handful of loans out there. A word of advice though: join our Lending Team first, before you fund a loan, because a loan only counts towards the team if it is made after you join.
Absolutely no skin in this game…
I want to be very clear: there is ZERO personal profit motivation for me in this. Teams are merely a community-building effort, and nothing about making a loan as part of the GiveCamp team in any way accrues to the personal benefit of myself or anyone else.
by community-syndication | Sep 21, 2008 | BizTalk Community Blogs via Syndication
In this article Martin Fowler looks at SOA and implementing Agile around the concept: http://martinfowler.com/bliki/EvolutionarySOA.html
by community-syndication | Sep 21, 2008 | BizTalk Community Blogs via Syndication
Auckland Connected Systems User Group
Building SOA With Microsoft Technology
Ulrich Roxburgh
Ulrich Roxburgh has worked for Microsoft Consulting Services for 9½ years, in various capacities ranging from Senior Consultant to Managing Consultant, in both New Zealand and Australia. He now works as the main consultant for Services2 Ltd., providing premium consultancy services in the areas […]
by community-syndication | Sep 21, 2008 | BizTalk Community Blogs via Syndication
Article Source: http://geekswithblogs.net/michaelstephenson
Following a recent post about the different approaches to caching you might consider when implementing reference data mapping in BizTalk, one of the things that stood out most was that teams which used a caching approach often ended up not using the BizTalk Cross Referencing features. As I’ve mentioned many times, I prefer to use cross referencing unless there is good cause not to (there are reasons why you might not want to), but I feel development teams often ignore, or don’t consider, the extra work that adding custom databases to a solution requires in development, testing, deployment and management.
In most cases, why would you want to do this when you already have a data store designed for the purpose? One criticism I would make of BizTalk is that the product does not do a very good job of making the cross referencing features easy to use from a developer’s perspective, but these issues can all be worked around with few problems.
Anyway, I have decided to produce this sample showing how I have combined NCache and BizTalk Cross Referencing to get a solution which does not need custom databases yet still has a high-performance caching solution which will not increase the BizTalk host process’s memory unnecessarily. The sample can be downloaded from the bottom of the article.
Prerequisites for the sample
You can obviously review the code in this sample, but if you want to run it you will need to do the following things:
- Install NCache Express Edition
NCache Express Edition is available free from the following link: http://www.alachisoft.com/ncache/. I assume you will be installing it to the default location; if not, you might need to modify the msbuild script where I configure NCache.
- Modify Cross Referencing Setup
The SetupFiles.xml file in the solution contains the XML used to set up the cross referencing data in BizTalk. This file requires absolute paths to work, so you will need to tweak these to suit your location. The below picture shows what the XML looks like.
Setting up the sample
In the sample you will notice a file called Setup.cmd. Running this file will perform the appropriate actions to configure things for this sample. The actions it takes are as follows:
- Stop the cache in NCache if it is already running
- Clear the BizTalk cross referencing tables
- Stop the NCache windows service
- Copy the pre-configured NCache config files to their appropriate places to configure NCache with the cache we will use in this sample
- Start the NCache windows service
- Start the custom cache
- Load the BizTalk cross referencing tables using BTSXRefImport
These actions are all done in an msbuild script (pictured below), which should make it easy for you to see how this is done.
You should now be able to run the sample.
My Cross Referencing Component
To keep the sample simple I have developed a component which provides an interface the same as that provided by BizTalk cross referencing. I provide a class called CrossReferencingFacade which implements the façade pattern to give you an easy way to obtain the common and application-specific IDs. The below picture shows this:
There is also a test in the test project which shows how to consume this component. It is as easy to consume as the BizTalk cross referencing DLL. If you look in the CrossReferencingManager class you will see there are two key methods, discussed below:
This will use some data access code to retrieve all of the cross reference data for one specific type of cross reference (xrefId), for example all of the mappings for Product Type. It will then return them to the calling method.
This method will check the cache to see if the data is already there for the requested cross reference data type. If present, the data will be returned from the cache; if not, it will be loaded using LoadXRefIDData and then placed in the cache.
The result is that the data is cached once for both the GetAppID and GetCommonID methods.
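The check-then-load behaviour just described is the classic cache-aside pattern. As a rough sketch (Python with hypothetical names, since the sample's actual C# is only shown in screenshots), it looks like this:

```python
class CrossReferenceCache:
    """Cache-aside pattern: return cached cross-reference data for a
    type if present, otherwise load it once and cache it, so both the
    app-ID and common-ID lookups reuse the same cached set."""

    def __init__(self, cache, load_xref_data):
        self._cache = cache            # any dict-like store (could be an external cache client)
        self._load = load_xref_data    # stands in for the data access code that loads all mappings for one xrefId

    def get_xref_data(self, xref_id):
        data = self._cache.get(xref_id)
        if data is None:               # cache miss: hit the store, once
            data = self._load(xref_id)
            self._cache[xref_id] = data
        return data
```

The point of the design is that however many lookups are made against a cross reference type, the backing store is queried only once per type.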
One interesting bit (and there may be better ways to do this) is that, to allow you to search for the appropriate mapping data from the same source by both CommonID and AppID, I have held the data in a container object which houses a dictionary of the reference data with a unique key for each entry, along with two further dictionaries of the app-specific keys and common ID keys. This makes it possible to hold the data just once but search for it in different ways. As mentioned, I’m sure with some thought there are better approaches, but it will do for this sample. (Note: although this may sound overly complicated, it is encapsulated so the consumer does not need to care about it.)
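That container can be sketched as one record dictionary plus two index dictionaries. This is an illustrative Python sketch with invented names, not the actual component from the sample:

```python
class XRefContainer:
    """Holds each cross-reference record once, indexed two ways.

    Records (app_id, common_id pairs) live in a single dictionary keyed
    by a unique key; two index dictionaries map app IDs and common IDs
    back to that key, so the data is stored once but searchable by
    either identifier.
    """

    def __init__(self):
        self._records = {}       # unique key -> (app_id, common_id)
        self._by_app_id = {}     # app_id -> unique key
        self._by_common_id = {}  # common_id -> unique key
        self._next_key = 0

    def add(self, app_id, common_id):
        key = self._next_key
        self._next_key += 1
        self._records[key] = (app_id, common_id)
        self._by_app_id[app_id] = key
        self._by_common_id[common_id] = key

    def get_common_id(self, app_id):
        return self._records[self._by_app_id[app_id]][1]

    def get_app_id(self, common_id):
        return self._records[self._by_common_id[common_id]][0]
```

The memory cost of the two extra dictionaries is only the keys; the records themselves are never duplicated.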
The NCache bit
So from the above, hopefully you can see I have provided a handy way to use BizTalk cross referencing within this sample. The next thing to discuss is NCache. I believe there are a number of additional features which come with the Enterprise version, such as security features and tools to manage caches, so for any production usage I would definitely recommend that version. For the purposes of this sample the Express edition is more than sufficient.
You can see from the below picture that the code to interact with NCache is very simple.
(Note: In the above picture you can only partly see it, but the cache allows you to insert objects with expiration parameters and also a cache dependency)
With NCache, one of the things I like is that I don’t need to worry about configuration within my application, so you will notice there aren’t any app.config files in the sample used by the consumers of the cache. That said, there is some configuration for the caching service. You will see in the NCache folder there is some config which controls how your caches are set up. The below picture from the config.ncconfig file shows how I have configured my cache for this sample.
You will notice here that I’m able to control whether my cache runs in process or out of process, which is how I’m able to move the cached data outside of my BizTalk host process, and there are a bunch of other possible settings. This configuration is held alongside the caching service.
Plugging it into BizTalk
This component is now very easy to add to a BizTalk implementation: use the call external assembly feature of the scripting functoid to call the component. You will now be able to use the cross referencing features of BizTalk, but with out-of-process caching of the data.
Summary
Hopefully you will see that it is not that difficult to implement a good caching solution which addresses a combination of the considerations I discussed in my previous article. I quite like this approach, and based on my limited experience of the different caching systems available I would probably at present choose NCache over Memcached because it is an established 3rd-party system which comes with additional tools and features to support it. That said, I will be keeping an eye on the “Velocity” project as I think it will definitely be one to watch for the future.
If you have any experiences with this I would be interested to hear your thoughts. The sample is available below:
by community-syndication | Sep 21, 2008 | BizTalk Community Blogs via Syndication
Article Source: http://geekswithblogs.net/michaelstephenson
I’ve been asked the same question a few times recently by a couple of BizTalk projects about how to map their reference data. When this question comes up we often get involved in a discussion about the pros and cons of caching the reference data and increasing memory usage versus hitting the database every time.
As a rule I tend to use the BizTalk Cross Referencing features for this data mapping unless there is a specific requirement for a custom approach. I’ve blogged about this kind of thing a few times before, but I thought it’s worth a post with some thoughts on the different approaches I’ve seen used when people have wanted to use caching.
I mentioned in a previous post that the Value cross referencing features already implement a simple caching mechanism. In my opinion, though, value cross referencing is aimed more at mapping data type values between types of systems, whereas ID cross referencing is aimed more at business reference data held in instances of systems.
Anyway, when it comes to this design decision, the things people are usually trying to balance are as follows:
- Performance – if I have a lot of things to map I don’t want to be hitting the database thousands of times
- Performance – if I cache the reference data, is there a risk it will consume a fair bit of memory and potentially cause throttling when the host process memory threshold is hit?
- Manageability – if I cache the data, there will be an instance of the cache in each host instance that uses it; how will I ensure these stay in sync?
- Manageability – caching will mean I have to restart all the hosts when the data changes
There are a number of possible ways to solve this problem and each have their own considerations which are discussed in the rest of this article.
Simple Singleton Approach
This is probably the most common approach I’ve seen. Normally a custom database is implemented to manage the reference data. The developer then implements a custom data access method and a singleton which controls access to the reference data. This is a pretty standard use of the singleton pattern. With this approach, some of the considerations are:
Pro’s
- Fast access to the data
- Easy to implement in terms of the C# coding
Con’s
- In most cases there is additional development of a database to manage the data, which involves additional development, testing, deployment and management work
- The data is cached in the host process so you need to watch for the impact on the process memory of the BizTalk host
- If you access this reference data from BizTalk maps running in different hosts then you may end up with multiple instances of the cached data on each server
- By default your cache usually will not detect changes to the underlying data, however with additional coding you can monitor the custom data and update any changes
- In most cases the hosts need to be restarted to pick up changes
- The cache will not be cleared when the data is no longer used
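As a minimal sketch of the singleton approach (Python with hypothetical names rather than the C# a BizTalk solution would actually use), a single shared instance loads the reference data once and serves it from process memory thereafter:

```python
import threading

class ReferenceDataCache:
    """In-process singleton cache for reference data.

    The loader (standing in for the custom data access code) runs once,
    on first use; every later lookup is served from memory. Note the
    cons listed above: the data lives inside this process, and changes
    to the underlying store are not detected without extra work.
    """
    _instance = None
    _lock = threading.Lock()

    @classmethod
    def instance(cls, loader):
        # Double-duty lock: guards both creation and the one-time load.
        with cls._lock:
            if cls._instance is None:
                cls._instance = cls()
                cls._instance._data = dict(loader())
        return cls._instance

    def lookup(self, key):
        return self._data.get(key)
```

Every host instance that uses this class gets its own copy of the data, which is exactly the synchronisation concern raised in the cons above.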
Caching the Response from a Web Service
Sometimes I’ve seen an approach where a custom database has been implemented and then a web service façade implemented on top of it. The web service accesses the data and returns it. To consume this from BizTalk, a C# assembly is developed which uses the web service to get the reference data, which is then consumed by a map.
Pro’s
- The caching is outside of the BizTalk process
- The caching can be relatively easily configured
- If the web service is located on the BizTalk box then a local machine hop would be quicker than going remotely, and also with WCF you could optimise this further using appropriate channels
Con’s
Using the HTTPCache
In this approach I’ve normally seen the implementation done in the same way as the singleton approach above. The key difference is that in the singleton approach the reference data is usually held locally in a static hash table, whereas in this approach the HttpCache object from the System.Web namespace is used. This gives a couple of options around sliding and absolute expiration, which will remove unused data from the cache, helping to control memory usage. You can also add one of the .NET cache dependency objects, which gives you a way to detect changes and refresh the cache.
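The expiration behaviour described here can be sketched language-agnostically. The following is an illustrative Python sketch in the spirit of the System.Web cache, not its actual API: each entry carries an optional absolute deadline and an optional sliding window, and expired entries are evicted on access.

```python
import time

class ExpiringCache:
    """Minimal sketch of sliding + absolute expiration.

    An entry expires when its absolute deadline passes, or when it has
    not been touched for `sliding` seconds; accessing an entry resets
    its sliding window.
    """

    def __init__(self):
        self._entries = {}  # key -> [value, absolute_deadline, sliding, last_touch]

    def insert(self, key, value, absolute=None, sliding=None):
        now = time.monotonic()
        deadline = now + absolute if absolute is not None else None
        self._entries[key] = [value, deadline, sliding, now]

    def get(self, key):
        entry = self._entries.get(key)
        if entry is None:
            return None
        value, deadline, sliding, last = entry
        now = time.monotonic()
        if (deadline is not None and now >= deadline) or \
           (sliding is not None and now - last >= sliding):
            del self._entries[key]   # evict the expired entry
            return None
        entry[3] = now               # touch: sliding window resets on access
        return value
```

Sliding expiration is what gives the "unused data falls out of the cache" property mentioned above: frequently read entries stay alive, while entries nobody reads age out.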
Pro’s
- Would be fast access to data
- Relatively simple to implement ways to detect changes
- Ability to clear the cache for unused data
Con’s
Using Enterprise Library/Caching Application Block
Enterprise Library has a caching block which provides a number of features which could help you solve this problem. One of the key benefits of enterprise library is that it supports different types of stores for the cached data including:
- Null – means just stored in memory
- Database
- Isolated Storage
If I remember right, the cache supports the same features as the HttpCache approach, which allows you to have a dependency and also expirations. There is an article at the following location which discusses using Enterprise Library caching in BizTalk: http://www.malgreve.net/2007/07/using-enterprise-library-in-biztalk.html.
Enterprise Library can also integrate with external backing stores to support out of process caching.
Pro’s
- Ability to abstract the caching store from the consuming code
- Standard caching feature set
Con’s
- Again usually some requirement for custom data store for reference data
- Enterprise Library usually requires lots of configuration to set up and manage
- Most commonly the data is cached in process, so you need to be aware of memory usage
Out of Process Caching
One approach I quite like involves caching the data outside of the BizTalk process. This provides the benefit that you can cache without having to worry about the impact on the BizTalk process memory usage. There are a number of caching tools which you can use to help here such as:
- NCache – http://www.alachisoft.com/ncache/
Alachisoft offer an express version of their caching product which is free, and a version for a relatively small cost which comes with some management tools for their distributed caching system.
- Memcached
Memcached is an open source distributed caching system. I know of some guys who have used this very successfully on a .NET project with a major UK company.
- Velocity – http://www.microsoft.com/downloads/details.aspx?FamilyId=B24C3708-EEFF-4055-A867-19B5851E7CD2&displaylang=en
Velocity is an initiative at Microsoft to create a distributed in-memory caching platform. I feel it is important to keep an eye on this as it evolves, as it is likely to become the best approach in the future.
These distributed caching systems offer the benefit of taking the memory usage out of your process while still offering fast access to the data via their API. Most of these products also offer high availability and synchronisation across a group of caches when you distribute them across your server group. I have in particular looked at NCache for this example; it is set up as a Windows service which you would deploy on each BizTalk box. These services would then be configured to work as a cluster, meaning they synchronise themselves when changes are made.
Pro’s
- Out-of-process caching still offers fast access to the cached data, but removes the likelihood that the cache might affect BizTalk performance
- These caches are designed for high performance; NCache, for example, is intended for high-performance customer-facing ASP.NET applications
- They can be integrated with caching frameworks such as Enterprise Library (NCache comes with this out of the box)
- NCache supports cache dependencies and expiration
Con’s
- Again, requires work and management of additional components. I think NCache (the paid version) offers better management and operations tooling
- Potentially brings up the 3rd party or open source debate around which cache system to use
Summary
Hopefully this article has highlighted the many options available when you are considering a caching solution to support your BizTalk implementation. There are many considerations to weigh, and as with most design decisions there isn’t a one-size-fits-all rule. One thing that stands out from this discussion is that most of the approaches above end up using a custom database to manage the reference data. In a future post I will look at how to combine some of the approaches discussed here with the BizTalk Cross Referencing features to produce a fairly simple yet effective combination of them all.
by community-syndication | Sep 21, 2008 | BizTalk Community Blogs via Syndication
Also take a minute to read Johan's blog post about best practices when building pipeline components.