Contest: Win an e-copy of the book BizTalk 2010 Line of Business Systems Integration

As I mentioned earlier, I have 3 e-copies of the new BizTalk 2010: Line of Business Systems Integration to give away, courtesy of Packt Publishing. This book was written by Kent Weare, Carl Darski, Thiago Almeida, Sergei Moukhnitski and Richard Seroter. You can read more about this book here on Packt’s […]
Blog Post by: Sandro Pereira

Sweden Windows Azure Group (SWAG) meeting on Thursday 27th October in Stockholm: "Extending to the Cloud"

I’ll be presenting a session on “Extending to the Cloud” at the Sweden Windows Azure User Group next week.
Extending to the Cloud
Extending to the cloud involves developing hybrid applications that leverage the capabilities of cloud-based platforms. Many of the Windows Azure solutions developed today extend the capabilities of existing applications and infrastructure to leverage the capabilities of Azure.
These hybrid applications allow developers to combine the best of both worlds: existing IT infrastructure can be utilized, and the capabilities of cloud platforms used to extend the applications, providing enhanced functionality at minimal cost.
This demo-centric session will highlight how the hosting, compute and storage capabilities of the Windows Azure platform can be used to extend on-premises applications to the cloud. The rendering of 3D animations and cloud-based source control will be used as case studies.
Free registration is here.

5 hours one-to-one chat with Charles Young

I’m not going to bore you readers with who Charles Young is; if you follow my blog and you are into BizTalk, you’ll know who he is.

From a BizTalk360 perspective, I wanted to give away a few copies of the "Microsoft BizTalk Server 2010 Unleashed" book to people in the community. To make it special, I wanted the book to be signed by some of the authors. Unfortunately, except for Charles, everyone else lives outside the UK. If I had had the idea earlier, I could have got it signed by Jan Eliasen a few weeks ago when I was in Denmark.

Anyway, this is how it all started: we exchanged a few emails and decided on a date to meet up. Charles was kind enough to come to our place; I picked him up from the station at 11:30 am this morning and we went to a coffee shop nearby. I was expecting maybe an hour or a bit more, but when we finished our conversation and I looked at my watch, it was 16:15.

These things don’t happen often: you sit down with someone like Charles, who is really excited and enthusiastic about technology, and have a one-to-one chat for nearly 5 hours. I have spoken to him in the past at MVP summits, but nothing to this extent. We had a great conversation around so many things: BizTalk, Windows Azure, Amazon, Google, a new term Cloud Service Vendor (CSV), and what he thinks is missing in Azure for CSVs. There was an interesting chat about iCloud and all the rumours around its use of both Windows Azure (here and here) and Amazon S3 for the service.

Quite a bit of the talk was around BizTalk and its future direction, the problems we face in the enterprise space, and how BizTalk penetrated that space as a mature product. We also discussed the book a bit and how he structured his chapters.

He thinks the subscription engine we have in BizTalk is just a flavour of a rules engine. If you think about it, it’s very true: all it does is evaluate certain rules and match subscribers. He also has a great vision of building solutions based on a policy-driven, layered approach. For example: subscriptions are the bottom layer, static ports are the layer above that, and dynamic ports are one layer above static ports, where you need to provide additional information like an address.
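The "subscriptions are just rules" observation can be sketched in a few lines. The following is purely illustrative (all names are hypothetical, and this is not BizTalk code): each subscription is a set of required context properties, and the engine evaluates every subscription against each message to find the subscribers that match.

```python
def matches(subscription, message):
    """A subscription maps property -> required value; it matches when
    every required property is present on the message with that value."""
    return all(message.get(prop) == value for prop, value in subscription.items())

def route(message, subscribers):
    """Evaluate every subscriber's rule and return the names that match."""
    return [name for name, rule in subscribers.items() if matches(rule, message)]

# Hypothetical subscribers: an orchestration and a send port.
subscribers = {
    "OrderOrchestration": {"MessageType": "Order"},
    "UkSendPort": {"MessageType": "Order", "Country": "UK"},
}

print(route({"MessageType": "Order", "Country": "UK"}, subscribers))
```

A real engine adds operators, priorities and the layered policies Charles describes, but the core is exactly this rule evaluation and subscriber matching.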

The conversation went deep into RulesFest and how people are using rules to solve real-world problems that require artificial intelligence and rule matching, like soil analysis for farming.

We also discussed J# quite a bit, and how the death of J# closed the door on the .NET community having access to a wide range of mature open source projects like Drools.

The interesting part of the conversation was how a failed POC project around 2005 made him interested in the Rete algorithm and rules engines. Only after the POC did he realize he had been trying to build a forward-chaining inference engine with BizTalk orchestrations. If he had known enough about rules engines at the time, he could have built a much sleeker solution to the problem.
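For context, forward chaining means repeatedly firing any rule whose premises are all known facts, adding its conclusion as a new fact, until nothing new can be derived. A naive illustrative sketch (hypothetical facts and rules; real engines such as Rete index the rules to avoid this full re-scan):

```python
def forward_chain(facts, rules):
    """facts: a set of fact names; rules: (premises, conclusion) pairs.
    Returns the closure of the facts under the rules."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for premises, conclusion in rules:
            # Fire the rule if all premises hold and it adds something new.
            if conclusion not in facts and set(premises) <= facts:
                facts.add(conclusion)
                changed = True
    return facts

rules = [
    (["order_received"], "credit_check_needed"),
    (["credit_check_needed", "credit_ok"], "order_approved"),
]

print(forward_chain({"order_received", "credit_ok"}, rules))
```

Building this chaining loop out of orchestrations is exactly the kind of thing that looks reasonable until you have seen a dedicated engine do it.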

Of course we did have a chat about BizTalk360, and one thing I understood was that not everybody has a clear idea of the real benefit of BizTalk360 and what problems it’s trying to solve. He was honest enough to admit it didn’t strike him the first time, until a co-worker pointed out the real use case for BizTalk360. I need to do some work to improve here.

We will soon announce how we are going to give away the signed copies of the book "Microsoft BizTalk Server 2010 Unleashed"; stay tuned. Follow us on twitter (http://twitter.com/biztalk360) or on facebook (http://facebook.com/biztalk360) to get the latest updates.

Nandri!

Saravana

Simpler, Smoother, Smarter

We (Enfo) had an internal image contest on the above theme. Unfortunately, the images you could enter were supposed to be of yourself at arm’s length, showing your face. Since I couldn’t quite imagine how to make myself express those three things with a simple facial expression, I took another picture. And since it won’t make the image contest, I am posting it here instead.

I am no graphic designer, though at times I could fool a few, but hail PowerPoint as far as my image-processing skills go 😉

Blog Post by: Johan Hedberg

Win an e-copy of the book BizTalk 2010 Line of Business Systems Integration

I have 3 e-copies of the new BizTalk 2010: Line of Business Systems Integration to be raffled away, courtesy of Packt Publishing. To be fair and impartial to all my readers, since they are spread all over the world in different time zones, I decided to make a pre-announcement of the contest so everyone has […]
Blog Post by: Sandro Pereira

End-to-end WCF code coverage with PowerShell

I’m a big fan of automated testing which actually proves something. Unit tests may prove all the components in a solution work independently, but that doesn’t mean you have a working solution. Integration tests give you confidence in the whole solution, but the cost of having a test suite with a lot of external dependencies is the risk of false build failures and long-running test suites.

To get something in between, we have acceptance tests written in SpecFlow which run tests at the highest layer – targeting the WCF services which are the external entry point to our solution. Michael Stephenson has blogged and given sessions on the power of BDD for unifying a project team’s understanding of a feature, and being able to get code coverage based on features proves the solution works as expected through all scenarios. Acceptance tests use stubs for any external dependencies, so they run quickly and allow us to confidently assert that our solution works, provided the dependencies work as expected.
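The stubbing idea is independent of SpecFlow or WCF. A minimal sketch in Python (hypothetical service and gateway names, not the real suite) shows the shape: the external dependency is replaced by a stub, so the test exercises the whole service path quickly and deterministically.

```python
from unittest.mock import Mock

class OrderService:
    """Service under test; depends on an external payment gateway."""
    def __init__(self, payment_gateway):
        self.payment_gateway = payment_gateway

    def place_order(self, amount):
        # In production this would be a slow, flaky network call.
        return "accepted" if self.payment_gateway.charge(amount) else "rejected"

# Stub the dependency instead of calling the real gateway.
gateway_stub = Mock()
gateway_stub.charge.return_value = True

service = OrderService(gateway_stub)
print(service.place_order(100))
gateway_stub.charge.assert_called_once_with(100)
```

The test still fails if the service's own logic is wrong, but never because a third-party system was down.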

Getting coverage for a whole stack of .NET code running beneath the WCF layer is trickier than you might expect. There’s no VS or MSBuild functionality for it, and you need to dig down to vsinstr and vsperfcmd. I’ve made it all as simple as possible with a generic PowerShell script you can call locally during development, and which runs as part of the MSBuild scripts on the server.

The script is on poshcode here: PowerShell script for running WCF code coverage against IIS.

Technically the script can be used to get coverage on any process hosted in IIS. The bulk of the work is done in functions within the script; the main block requires you to specify values for your own solution:


#--------------
# Script begins
#--------------

# set variables:
$solutionFriendlyName = "XYZ"
$wakeUpServiceUrl = 'http://localhost/x.y.z/Service.svc'
$appPoolName = "ap_XYZ"
$appPoolIdentity = "domain\svc_user"
$websiteBinDirectory = "C:\websites\xyz\x.y.z.Services\bin"
$coverageOutputPath = "Test.coverage"

# instrument assemblies:
# - instrument assembly so ALL namespaces are included in coverage
instrument "x.y.z.Services.dll"

# - instrument assembly so anything from the Ignore1 and Ignore2 namespaces is excluded from coverage
instrument "x.y.z.Core.dll" 'x.y.z.Core.Ignore1.*', 'x.y.z.Core.Ignore2.*'

# instrument W3WP, run tests & export results:
start-instrumentation
& 'C:\Windows\Microsoft.NET\Framework\v4.0.30319\msbuild.exe' Build.proj /t:$msBuildTarget /p:ConfigurationName=$configurationName
stop-instrumentation
export-coverage

#------------
# Script ends
#------------


The variable assignments at the top of the script need to be replaced with your own values:

  • solutionFriendlyName – just a friendly name which is used in logging output;

  • wakeUpServiceUrl – any URL which is part of your solution. The script hits the URL to start the app pool which it can then instrument for code coverage. It can be anything which fires up the worker process – .svc, .aspx etc.;

  • appPoolName – the name of the app pool your solution runs under. Ideally have a dedicated app pool, as the coverage run stops and starts it; if it’s dedicated to the solution you won’t impact anything else;

  • appPoolIdentity – the service account the app pool runs under. Used in setting up instrumentation;

  • websiteBinDirectory – path to the binaries you want to be instrumented, where the IIS site is running;

  • coverageOutputPath – specify what the coverage file will be called. It should have the .coverage extension;

  • instrument – add a line for every assembly you want to instrument. If the assembly contains just your code, you only need to name the assembly. If it contains generated code you don’t want included (e.g. service reference or EF model code), specify each namespace to exclude after the assembly name. Syntax is important here – each namespace should be fully specified up to the wildcard, quoted in single quotes, and multiple namespaces are comma-separated.

The script has some expectations which are hardcoded into its functions, so if your environment is different you’ll need to look at those:

  • running on Windows Server (expects to instrument W3WP.exe)
  • VS2010 installed at the default path
  • .NET4 installed

In our build scripts, we have one target to run the tests and another to run the tests with coverage. The PowerShell script calls back into the original MSBuild script to run the tests, so to incorporate the script into your build you need a target like this:

<Target Name="AcceptanceTestsWithCoverage">
  <Exec ContinueOnError="true"
        Command="powershell.exe &quot;$(MSBuildProjectDirectory)\RunCoverage.ps1 -msBuildTarget AcceptanceTests -configurationName $(ConfigurationName)&quot;"
        WorkingDirectory="$(MSBuildProjectDirectory)" />
</Target>

– passing the script the name of the MSBuild target to run to execute the tests.

The output from the script is a .coverage file which you can load into VS (note that if you rebuild after running coverage, the assemblies will no longer be instrumented and the coverage won’t load correctly), and an XML export of the coverage (which you can roll up in your build scripts, or load into VS).

Leonard Lobel Awarded Microsoft MVP for SQL Server!

This month, Microsoft awarded 143 exceptional technical community leaders with the Most Valuable Professional (MVP) title and re-awarded 764 MVPs worldwide. Tallan is thrilled to announce that Lenni Lobel has been recognized as an MVP in SQL Server!
According to the Microsoft MVP Award Program Blog, there are more than 100 million social and technical community […]
Blog Post by: milyevsky

BizTalk User Group Sweden : My debut as International Speaker

Next week a new BizTalk User Group Meeting in Sweden is scheduled with two sessions:

  • A Lap Around BizTalk Adapter Pack
  • A Lap Around BizTalk360

The first session will be done by me, and it will be the first time I do a session in English in another country. The other session will be done by Saravana, on BizTalk360. It will be the second time Saravana and I share the same stage; the last time was in June this year, for the Dutch BizTalk User Group.

There are still seats left for this event, so you can attend if you want; it is in Stockholm, and I expect many, if not all, of the attendees to be from Sweden. My session will be about the BizTalk Adapter Pack: its origin and evolution, how it became the pack that ships along with BizTalk 2010, its alignment with the Microsoft platform and the cloud, and an integration scenario with Oracle. Saravana will take you through his BizTalk360 product, showing a lot of its capabilities and how you can leverage them in a BizTalk production environment.

For those who will attend, I am looking forward to meeting you and presenting my story on the BizTalk Adapter Pack. Thanks to Johan, Mikael and Microsoft Sweden for inviting me and hosting the event.

Cheers!

BizTalk360 – Knowledge base repository

One of the major enhancements we introduced in the BizTalk360 2.5 release is the Knowledge base repository capability. The basic idea behind it is that you keep accumulating KB articles as and when you fix issues in the environment. Over a period of time you build up a wealth of information readily available within the tool (BizTalk360) you are using to support the environment, which greatly reduces the time it takes to fix repeating problems. Some repeating problems in the environment are inevitable; examples include:

  • Environment configuration
  • Data issues (junk characters, encoding issues, etc.)

The BizTalk360 KB repository makes this seamless. It comes with the following features:

  • Attach a KB article to suspended instances’ error codes
  • Attach a KB article to an event ID
  • Centrally manage all the KB articles
  • Assign permissions for who can edit an article

You can read the complete document here: Understand Knowledge Base Repository. We also produced a couple of videos to show you how the whole thing works.

Knowledgebase Repository – Introduction (3 minutes)

Knowledgebase – Central management and user permissions (2 minutes)

Nandri!

Saravana Kumar

http://www.biztalk360.com | http://support.biztalk360.com
@biztalk360 | http://facebook.com/biztalk360  | http://getsatisfaction.com/biztalk360

WCF BAM interceptor and no data

I had a situation where I was looking at using BAM (Business Activity Monitoring) from WCF (Windows Communication Foundation), and I couldn’t find out why I didn’t get any data into my BAMPrimaryImport database. I didn’t get any errors, so I was really wondering where my mistake was.

I launched SQL Profiler to see if it could help me in my search for why I didn’t get any data. When the WCF service is launched, it calls a stored procedure:

exec bam_Metadata_GetLatestInterceptorConfiguration @technologyName=N'WCF', @manifest=N'<namespace>.<interface>, <AssemblyName>, Version=1.0.0.0, Culture=neutral, PublicKeyToken=null'

This stored procedure returned zero rows, which was why I didn’t get any data into my tables in the BAMPrimaryImport database. I wasn’t yet sure why the call didn’t return any data, so I looked at the SQL code for the stored procedure and found that it queried the table [bam_Metadata_EventSource]. That table did hold the information I was asking for, but the case of the <AssemblyName> in the manifest column was different from what I saw used in the call to the stored procedure.
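The failure mode is easy to reproduce outside SQL. A small illustrative sketch (hypothetical manifest strings) of why a casing difference can yield zero rows from an exact-match lookup:

```python
# Hypothetical manifests: what the metadata table stores vs. what the
# interceptor configuration requests (assembly name casing differs).
stored_manifest = ("My.Namespace.IService, MyAssembly, "
                   "Version=1.0.0.0, Culture=neutral, PublicKeyToken=null")
requested_manifest = ("My.Namespace.IService, myassembly, "
                      "Version=1.0.0.0, Culture=neutral, PublicKeyToken=null")

def lookup(manifest, table):
    # Exact, case-sensitive comparison, like a case-sensitive match in SQL.
    return [row for row in table if row == manifest]

print(lookup(requested_manifest, [stored_manifest]))          # no rows found
print(requested_manifest.lower() == stored_manifest.lower())  # only casing differs
```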

So what I learned from this is to get the spelling and casing right in the IC file map:

  <ic:EventSource Name="xxx" Technology="WCF" Manifest="<namespace>.<interface>, <assemblyname>, Version=1.0.0.0, Culture=neutral, PublicKeyToken=null">

Something that might otherwise be hard to find.