VirtualBox and 100% CPU usage – the SuperFetch effect

I have a Dell laptop with 8GB RAM and a dual-core processor running Windows 7. I use my machine quite intensively. Quite often I run virtual images under Oracle’s VirtualBox. I’m currently using an image that contains an installation of the BizTalk Server 2010 beta.
For months, I found that every time I fired up images under VirtualBox, I would run into problems. I might be fine for a while, but then, at some point, problems would arise. The machine would become totally unresponsive for a few minutes. Task manager would report 100% CPU usage. After a while, things would revert to normal. A few minutes later, I would have to endure another period of unresponsiveness. This would continue for an hour or two, and then everything would revert to normal.
The same problem arose on my machine every Wednesday lunchtime, regardless of the use of VirtualBox. My machine is a company laptop. It has Forefront installed and scheduled to run a full scan at this time. I can’t change the schedule because of policies applied to the machine, although our systems people assure me they have been careful to ensure that Forefront excludes very large files such as virtual images from the scan.
I’ve lived with this issue for a long time. Normally, I would hope to be able to spot a rogue Windows process that is hogging all the processor cycles. However, in this case, there was never any indication of any process causing problems. The combined CPU usage of all processes, including processes belonging to other users, was always substantially below 100%. This was true for processes on the host as well as those running under VirtualBox. Another strange thing was that, over many weeks, I kept seeing patterns emerge (not counting the Wednesday lunchtime issue, which had an obvious link to Forefront). I might have three days when the problem occurred at the same time each day. Then the pattern would disappear and some different pattern would later emerge.
Of course, I used various tools to try to track down the problem. With no rogue processes, my best guess was that the problem must be occurring at a lower level, perhaps with some badly behaving driver. I found no hint of any problems with drivers, though. I’ve seen rather similar behaviour in the past due to interrupt conflicts, but again, no sign of any issues. I spent quite some time with Sysinternals Process Explorer trying to track down the problem, but to no avail.
Then, a few days ago, I got my first solid clue as to what was happening. It was Process Explorer that helped. My attention was drawn to an instance of the Windows service host that I could see was using a few % of CPU cycles. I opened it up and had a look. One of the nice things about Process Explorer is that it provides a graph of CPU usage at the process level. The graph I saw grabbed my full attention. There, before my eyes, was a lovely trace showing clearly that, at just the same time my machine had gone into 100% CPU usage, this process had suddenly started using a few % of CPU cycles. The moment the CPU usage dropped back down, so did the graph.
Process Explorer allows you to see all the services that are running in an instance of the service host. I set things up and waited for the problem to re-emerge. Sure enough, after a minute or two, CPU usage rocketed sky high. I had previously discovered that if I paused VirtualBox, the CPU usage would drop back to about 80%. The machine was still very sluggish, but could be used. So, I paused VirtualBox, waited an eternity for the mouse click event to be processed, and then got to work. As quickly as I could, I worked through the list of services. The host was running exactly ten Windows services. I stopped the Desktop Window Manager Session Manager [UxSms] – bang went my Aero interface. I stopped the Distributed Link Tracking Client [TrkWks] service – no change. I killed Human Interface Device Access [hidserv] – and so on. At last, on the seventh service, I stopped SuperFetch and, after it took ages to close, everything burst into life.
I have been running VirtualBox constantly since then, over several days. I have yet to see any recurrence of the issue. Last Wednesday, for the first time in a very long time, Forefront completed a full scan without issues. Wonderful.
Is SuperFetch at fault? I can’t say. Is it just a bad installation of Windows 7? Maybe. Perhaps VirtualBox is the true culprit. That’s possible. I have no idea. All I know is that my productivity is now significantly higher after switching SuperFetch off. To switch it off, I simply opened the ‘Services’ administration management console and disabled the SuperFetch Windows service.
I discussed my experience on the BizTalk Gurus newsgroup and two other people responded that they had been having similar issues. They have both switched SuperFetch off. One, Randal van Splunteren, got back to me to say that his VirtualBox problems were significantly reduced by switching SuperFetch off. However, as he pointed out, there was still a fairly high ongoing CPU usage when VirtualBox is running (45-50% on my box). As I understand it, VirtualBox always soaks up CPU cycles, even when the image is idle, due to timing interrupts, which I presume have something to do with synchronising the virtual image to the actual hardware. This is to be expected, but the CPU usage did seem too high for comfort. Randal investigated further and came up with a further improvement. Both he and I had configured our images to use two logical CPUs. You can control this on the Processor tab under Settings/System for a specific image. Reducing this to 1 significantly reduced the CPU usage.
Randal reports that this only really helped once he had reverted back to an older version of VirtualBox. Under Oracle’s ownership, we are currently at version 3.2.8. Randal recommends ditching this version in favour of version 3.1.8, which belongs to the Sun era. I experimented on my machine with both versions. My experience was that setting the number of logical processors to 1 made a significant difference on both versions, but the effect was a little greater under the older version. I get about 20% CPU usage under 3.2.8, but perhaps only 15% under 3.1.8. Randal has decided to use the older version. I’ve decided to stick with the current version, at least for now.
So, a combination of disabling SuperFetch and configuring a single logical processor has made all the difference. If you have been having problems running VirtualBox, there may also be merit in reverting to an older version.

Installing the ESB Toolkit v2.0 on Windows 7 – Error: Operation Not Completed

Here’s an interesting one: I was installing the ESB Toolkit 2.0 on a Windows 7 x86 machine the other day, and during the install I got an error from Visual Studio saying “The operation could not be completed.”:

If you choose OK, the ESB Installer finishes with a success message, but then the Itinerary Designer doesn’t work in Visual Studio (you get an error about a missing assembly when you try to add a new itinerary to a project).

There was nothing…

Workflows and no persist zones

There are times when a workflow can’t be persisted safely using a SqlWorkflowInstanceStore. The problem isn’t so much saving the state of a workflow to disk, which could be done at any time, but what would happen when a workflow is reloaded from disk in that state.

An easy example is a workflow handling a WCF request with a Receive and SendReply activity pair. Suppose you saved the workflow state after the message had been received but before the response had been sent. No problem there. Now suppose the workflow is aborted just after the response is sent, and the saved state is later reloaded. The first action of the reloaded workflow would be to send the response again. But where to? The connection is long gone, and the client has already seen the original response and is happy. Clearly this would fault the workflow again, and for that reason the workflow is in a no persist zone starting at the Receive and ending at the SendReply activity.

But wait. The SendReply activity has a property named PersistBeforeSend, so surely that means the workflow will be persisted before the reply is sent? Actually it doesn’t; it persists after the response has been sent. So the property name is wrong? Not quite, although I believe it is poorly named: the Persist activity is actually scheduled just before the response is sent, but due to the asynchronous nature of workflow execution it doesn’t run until after the response has actually been sent, at which point it is safe to reload the workflow.

So what happens if you try to persist a workflow in a no persist zone? You will get an exception with the following message:

Persist activities cannot be contained within no persistence blocks.

In this case all I did was drop a Persist activity between a Receive and SendReply activity pair; I didn’t even add or configure the SqlWorkflowInstanceStore itself.

 

Checking for a no persist zone

So how can we check whether we are currently in a no persist zone? It turns out the NativeActivityContext has a property called IsInNoPersistScope that will tell us. Unfortunately the property is internal, so we can’t use it [:(].

It turns out that duplicating this behavior is quite easy though. Whenever a no persist zone is started, a NoPersistProperty is created and stored in the NativeActivityContext Properties collection. The NoPersistProperty is also internal, but that is no problem: all we need to do is check whether an object is stored under the name "System.Activities.NoPersistProperty", something we can do with the following activity:

public class CheckForNoPersistZone : NativeActivity
{
    public OutArgument<bool> IsInNoPersistScope { get; set; }
 
    protected override void Execute(NativeActivityContext context)
    {
        var prop = context.Properties.Find("System.Activities.NoPersistProperty");
        IsInNoPersistScope.Set(context, prop != null);
    }
}

 

Creating our own no persist zones

Sometimes it might be necessary to stop a workflow from persisting, and we can do so by creating our own no persist zone. There is no standard activity for this, but building one isn’t hard. The basic building block is a NoPersistHandle: calling Enter() and Exit() on it starts and ends the no persist zone. This could easily be done either as a composite activity or as two separate activities. In the code below I have chosen the single composite activity approach.

public class NoPersistZone : NativeActivity
{
    private Variable<NoPersistHandle> NoPersistHandle { get; set; }
    
    [RequiredArgument]
    public Activity Body { get; set; }
 
    protected override void CacheMetadata(NativeActivityMetadata metadata)
    {
        NoPersistHandle = new Variable<NoPersistHandle>();
        metadata.AddImplementationVariable(NoPersistHandle);
 
        base.CacheMetadata(metadata);
    }
 
    protected override void Execute(NativeActivityContext context)
    {
        var noPersistHandle = NoPersistHandle.Get(context);
        noPersistHandle.Enter(context);
        context.ScheduleActivity(Body, OnCompleted);
    }
 
    private void OnCompleted(NativeActivityContext context, ActivityInstance completedInstance)
    {
        var noPersistHandle = NoPersistHandle.Get(context);
        noPersistHandle.Exit(context);
    }
}

 

Another nice activity to have for a generic toolbox.

 

Enjoy!

www.TheProblemSolver.nl

Wiki.WindowsWorkflowFoundation.eu

BizTalk 2010: Changed features and tools

 

I have been reading through the BizTalk 2010 Technical Review and noticed that a few things have changed in BizTalk 2010 compared to the current 2009 version. The SQL Adapter has been removed, and one should now turn to the WCF-SQL adapter that is part of the BizTalk Adapter Pack 2.0. Although, in my BizTalk Server 2010 Beta environment on VPC, I still see the SQL Adapter.

In the forums I notice a lot of people still use the standard out-of-the-box SQL Adapter; they will have to make the shift to the WCF-SQL adapter in the future. For anyone who wants to learn more, the article on Packt Publishing by Richard Seroter called New SOA Capabilities in BizTalk Server 2009: WCF SQL Server Adapter is a good starting point, and as a follow-up my previous post WCF-SQL Table Operations discusses how to set up the adapter and how CRUD operations on tables in SQL Server work.

Another change is that the BizTalk Explorer tool in Visual Studio 2010 is not available for BizTalk Server 2010. Any administrative task you, as an administrator or developer, want to perform with a graphical UI is now done in the BizTalk Server Administration Console. I think a lot of people stopped using BizTalk Explorer in 2006, when the Administration Console became available.

In the Administration Console you will also find the BizTalk Server Settings Dashboard, which is a new feature. Performance settings previously managed through the registry and other locations have now been centralized in this dashboard. You can use it to manage settings (i.e. performance) for the group, hosts, and host instances. These settings control throttling, thresholds and other runtime parameters in a single location, and all settings can be exported and imported to move them between environments. Thiago has written an excellent post on this called BizTalk Server 2010 Beta: Settings Dashboard and Latency control per host.

Cheers.

Technorati: biztalk biztalk server 2010

Forcing HTML to recognize Carriage Return Line Feeds from C# method

I had an XSLT style sheet that used a C# method to return a multi-line string. I wanted it to include the HTML <BR></BR> tag at specific places within the string, but not as literals. After some searching around I found that instead of trying to stop the <BR></BR> tag being treated as a literal, I could simply use the <pre></pre> tag within my HTML, which forces it to recognize the “\r\n” sequences returned from the C# method.

Sum of Product using XSLT and .NET

I needed to find the total purchase order amount given the quantity and line item price on an X12 850 using XSLT and .NET to transform XML into HTML.

In my case:

PO102 = Quantity

PO104 = Price

Add the following to the stylesheet declaration:

xmlns:msxsl="urn:schemas-microsoft-com:xslt"

for example:

<xsl:stylesheet version="1.0" xmlns:xsl="http://www.w3.org/1999/XSL/Transform" xmlns:ns0="http://schemas.microsoft.com/BizTalk/EDI/X12/2006" xmlns:msxsl="urn:schemas-microsoft-com:xslt" xmlns:var="http://schemas.microsoft.com/BizTalk/2003/var" exclude-result-prefixes="msxsl var userCSharp" xmlns:userCSharp="http://schemas.microsoft.com/BizTalk/2003/userCSharp">

In your XSLT use the following code, modifying the referenced node and element names to match your XML:

<xsl:variable name="tmpTotal">
  <total_amount>
    <xsl:for-each select="ns0:PO1Loop1">
      <item>
        <xsl:value-of select="ns0:PO1/PO102 * ns0:PO1/PO104"/>
      </item>
    </xsl:for-each>
  </total_amount>
</xsl:variable>
<total>
  <!--<xsl:variable name="myTotal" select="msxsl:node-set($tmpTotal)"/>-->
  <xsl:variable name="myTotal" select="$tmpTotal"/>
  <xsl:value-of select="sum(msxsl:node-set($myTotal)/total_amount/item)"/>
</total>
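To make the arithmetic explicit outside XSLT, here is a rough Python equivalent. This is a sketch only: it uses the standard-library ElementTree, invents a tiny two-line-item document, and omits the X12 namespaces for brevity; the element layout (PO1Loop1/PO1 with PO102 = quantity, PO104 = price) mirrors the structure above.

```python
# Sum of quantity * price over each PO1 loop, mirroring the XSLT above.
import xml.etree.ElementTree as ET

DOC = """<X12_850>
  <PO1Loop1><PO1><PO102>4</PO102><PO104>2.50</PO104></PO1></PO1Loop1>
  <PO1Loop1><PO1><PO102>2</PO102><PO104>10.00</PO104></PO1></PO1Loop1>
</X12_850>"""

root = ET.fromstring(DOC)
total = sum(
    float(loop.findtext("PO1/PO102")) * float(loop.findtext("PO1/PO104"))
    for loop in root.findall("PO1Loop1")
)
print(total)  # 4 * 2.50 + 2 * 10.00 = 30.0
```

The msxsl:node-set() step in the XSLT exists only because XSLT 1.0 treats the intermediate variable as a result tree fragment; in procedural code the intermediate list simply never needs materialising.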

Using MSBuild 4.0 web.config Transformation to Generate Any Config File

[Source: http://geekswithblogs.net/EltonStoneman]

Web.config transformation is a simple and powerful inclusion in .NET 4.0 for generating configuration files for different environments. It’s a templated match-and-replace, and you could put together a homegrown alternative with T4 and some scripting, but the integrated experience is better. It’s limited to web.config in Visual Studio, but with a simple MSBuild target you can leverage it for any config file.

The transform is based on your Visual Studio solution configurations, so you can start by replacing the standard “Debug” and “Release” builds with custom configurations to suit your environments. Then run “Add Config Transforms” by right-clicking the web.config file, and VS will generate stub files for each build configuration:

The original Web.config file is the source file, and the .DEV, .TEST and .PROD versions are the transform files which contain the overrides needed for each environment. So your Web.config may define a database connection:


<connectionStrings>
  <add name="db1"
       connectionString=" "
       providerName="System.Data.SqlClient" />
</connectionStrings>

In Web.DEV.config you provide the connection string for the development environment and specify a matching pattern. In this case we specify a match on the name attribute of the element, and all the other attributes you provide will replace those in the source (there are more complex transformations available):


<connectionStrings>
  <add name="db1"
       connectionString="Data Source=.\SQLEXPRESS;Initial Catalog=xyz;Integrated Security=SSPI;"
       xdt:Transform="SetAttributes" xdt:Locator="Match(name)" />
</connectionStrings>

Switch to the DEV build type, right-click the web project and select “Build Deployment Package” to generate a config file for dev which contains the transformed node with the connectionString attribute from the transform file and the providerName attribute from the source file:


<connectionStrings>
  <add name="db1"
       connectionString="Data Source=.\SQLEXPRESS;Initial Catalog=xyz;Integrated Security=SSPI;"
       providerName="System.Data.SqlClient" />
</connectionStrings>

All straightforward: the deployment package VS builds is used with the WebDeploy tool, which keeps things simple. The actual config file will be created in obj\<Configuration>\TransformWebConfig\transformed. You can get the same result from MSBuild by building the solution file with the property DeployOnBuild=True.
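For anyone curious about the merge behaviour itself, the SetAttributes/Match(name) semantics can be sketched in a few lines of Python. This is an illustration of the merge rules only, not the real TransformXml task; the element and attribute names simply mirror the example above.

```python
# Sketch of xdt:Transform="SetAttributes" with xdt:Locator="Match(name)":
# attributes named in the transform replace those in the matched source
# element; attributes the transform does not mention are left untouched.
import xml.etree.ElementTree as ET

SOURCE = """<connectionStrings>
  <add name="db1" connectionString=" " providerName="System.Data.SqlClient" />
</connectionStrings>"""

TRANSFORM = """<connectionStrings>
  <add name="db1"
       connectionString="Data Source=.\\SQLEXPRESS;Initial Catalog=xyz;Integrated Security=SSPI;" />
</connectionStrings>"""

def set_attributes(source_xml, transform_xml, match_attr="name"):
    src = ET.fromstring(source_xml)
    for override in ET.fromstring(transform_xml).findall("add"):
        for node in src.findall("add"):
            if node.get(match_attr) == override.get(match_attr):
                node.attrib.update(override.attrib)
    return src

merged = set_attributes(SOURCE, TRANSFORM)
entry = merged.find("add")
print(entry.get("connectionString"))  # taken from the transform file
print(entry.get("providerName"))      # kept from the source file
```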

The actual transform logic is all in the TransformXml MSBuild task from Microsoft.Web.Publishing.Tasks.dll. So you can utilise the same functionality for generating any config file with a simple MSBuild script:


<UsingTask TaskName="TransformXml" AssemblyFile="bin\Microsoft.Web.Publishing.Tasks.dll" />

<Target Name="GenerateConfigs">
  <MakeDir Directories="$(BuildOutput)" Condition="!Exists('$(BuildOutput)')" />
  <TransformXml Source="BTSNTSvc.exe.config"
                Transform="BTSNTSvc.exe.$(Configuration).config"
                Destination="$(BuildOutput)\BTSNTSvc.exe.config" />
</Target>

You’ll have to manually create the source and transform files for each environment, but you can maintain the naming convention so the pattern is consistent across your Web projects and other config targets. One thing the transform doesn’t do is render the runtime value of MSBuild properties – so if you want to include the build version number in an attribute of the config file, then you’ll need a custom step before calling TransformXml to parse the property values.

For application config files you can have this step in your build prior to creating MSIs, so that the MSI has the correct config values when deployed. For external configs – if you need to populate say machine.config or BTSNTSvc.exe.config – you can create the configs in the build and have a manual deploy step to overwrite the files, or if you’re brave you could create an MSI which just contains the config files and overwrites the targets on install.

Note that Microsoft.Web.Publishing.Tasks.dll is part of VS 2010 and not part of MSBuild (default install location: C:\Program Files\MSBuild\Microsoft\VisualStudio\v10.0\Web), so you will need to be licensed to use the assembly on your build server.

This is an excellent way to centralise all your environment variables in source control along with the project code, and removes the risk of manual config updates as part of deployment.

TSQL count all rows in a database

I needed to count all of the rows in an entire database.

I looked around and came up with a pretty easy way to do this.

Create this stored procedure; when you run it, it will display the total for you.

CREATE PROC AllTableCount
AS
BEGIN
    /* Create a temp table to hold the row counts for all of the tables */
    CREATE TABLE #TableRowCount (
        TableName sysname,
        [RowCount] int
    )

    /* Now actually get the counts */
    EXEC sp_MSforeachtable
        'INSERT #TableRowCount (TableName, [RowCount]) SELECT ''?'', COUNT(*) FROM ?'

    /* Finally: sum up all of the counts and show it */
    SELECT SUM([RowCount]) AS [Total Number Of Rows From Database]
    FROM #TableRowCount
END
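The same enumerate-count-sum idea translates to other databases. As a self-contained illustration (SQLite via Python’s standard library as a stand-in for SQL Server, with made-up tables), the approach is: list the user tables, COUNT(*) each one, and sum the results.

```python
# Count all rows across every table in a database: enumerate the tables
# from the catalog, run COUNT(*) against each, and sum the counts.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (id INTEGER);
    CREATE TABLE customers (id INTEGER);
    INSERT INTO orders VALUES (1), (2), (3);
    INSERT INTO customers VALUES (1), (2);
""")

tables = [name for (name,) in conn.execute(
    "SELECT name FROM sqlite_master WHERE type = 'table'")]
total_rows = sum(
    conn.execute("SELECT COUNT(*) FROM " + t).fetchone()[0] for t in tables)
print(total_rows)  # 3 + 2 = 5
```

On SQL Server the stored procedure above does the enumeration for you via the undocumented sp_MSforeachtable helper; the sketch just makes the underlying loop visible.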