Upgrading from SharePoint 2010 Beta 2 to RTM

I’ve just spent the past couple of days looking into the ’upgrade’ process and I’ve
come out the other side. Here’s a quick jot-down of what I encountered:

  1. Environment: Single WFE (Win2K8 R2 x64) and a separate SQL 2008 x64
    ‘backend’. Currently running SP2010 Beta2 with content databases around 13GB
  2. I haven’t uninstalled any of the existing SP2010 Beta 2 bits – this is a VM and I
    have a handy backup, so ‘in case of emergency, break glass’ was my plan B.
  3. Launched Setup.bat – up came the intro screen and I selected Install
    PreReqs.
    In my case the prereqs failed on their first install, grumbling
    about the IIS Web Role not being present (but this was an existing SP2010 server, so
    the Web Role was obviously present and running correctly).

    Ran the PreReqs again and all was fine 🙂

  4. Setup.exe – launched the product, provided a license key and we ran
    all the way through with no problems. Interestingly, the installer didn’t mention anything
    like “I found a previous version of SharePoint, do you want to …”
  5. Configuration Wizard – this is where my trouble started.
    1. As the Wizard was launching, it appeared unable to recognise the
      fact that we were already in a farm. It was as if we were running on a machine where
      the DB Server wasn’t accessible.

      I ’removed’ the Server from the Farm (which was the only option the Wizard gave
      me).

    2. Re-ran the Wizard (take #2), supplied the details at the beginning,
      Server Farm etc., and sent it on its way.
    3. The Wizard ran up to step 3 of 10 (where it configures the Farm DB
      and creates the SharePoint Web Services IIS site) and then FAILED.
      What I noticed was that the Farm DB was created successfully, but the IIS side was failing.
      (The error logs spoke about a socket-based error – which I think was unrelated.)
    4. Tweaking and rerunning the Wizard didn’t fix things, so here’s my fix.
    5. I also uninstalled SP2010 RTM a few times and reinstalled, but hit the same error.
  6. The fix –
    1. Delete the \14 hive
    2. Delete the Registry Key (and all under it) HKLM\Software\Microsoft\Shared
      Tools\Web Server Extensions
    3. Uninstall/reinstall the IIS Web Role on the Server
  7. Repair the SharePoint install by rerunning Setup.exe and choosing the
    repair option.
    (You might be able to leave this step out, but I deleted the
    Web Server Extensions key after I installed SP2010, so I needed it to be rewritten.)
  8. Run Config Wizard
  9. Upgrade any Content DBs you want to mount – through stsadm
    -o addcontentdb
    or Mount-SPContentDatabase (PowerShell).
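
For reference, step 9 might look like this (the URL and database name are placeholders for your own environment):

    stsadm -o addcontentdb -url http://sp2010 -databasename WSS_Content_Intranet

or, in PowerShell:

    Mount-SPContentDatabase -Name WSS_Content_Intranet -WebApplication http://sp2010

Either way, the content database schema gets upgraded as the database is attached.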

    I’m sure you’ll be able to shorten this list of steps when you upgrade, but I’ve got
    to get on and configure this environment.

    Have fun,

    Mick.

SharePoint 2010 RTM and BCS Permissions


If you are using SharePoint 2010 RTM and get the following error:

Access denied by Business Data Connectivity.

This is likely because, by default, the BCS service is installed with no permissions
enabled at all.

If you go to your central administration site and manage the BCS service and look
at Metadata Store permissions you’ll see that it is empty

Obviously in a production environment you want to be very deliberate about setting
the permissions, but when you are developing you just want your code to work 🙂

Here’s a PowerShell script that will set the permissions on the Metadata Store (and
all BCS models deployed after you run the script) for all Windows users.  Again,
not something I recommend for a production machine – but something that is fine for
your development machine.

$regKey = Get-Item "HKLM:\SOFTWARE\Microsoft\Shared Tools\Web Server Extensions\14.0\WSS\"
$adminurl = $regKey.GetValue("CentralAdministrationURL")
Write-Host "Central admin site: $adminurl"

Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

Write-Host "Updating BCS permissions"
$obj = Get-SPBusinessDataCatalogMetadataObject -BdcObjectType Catalog -ServiceContext $adminurl
$right = (1 + 2 + 0x10 + 4)   # combined rights mask
# 'c:0!.s|windows' is the encoded claim for all Windows users
$claim = New-SPClaimsPrincipal -Identity 'c:0!.s|windows' -IdentityType EncodedClaim
Grant-SPBusinessDataCatalogMetadataObject -Identity $obj -Principal $claim -Right $right
Copy-SPBusinessDataCatalogAclToChildren -MetadataObject $obj
Write-Host "Complete"



Check out my new book on REST.

Stray Text in BTARN Messages and BTARN Service Validation Failures

A recent bug found in BTARN processing can cause stray text to appear in response messages. The issue is a temporary variable not being cleared. Here is a sample from a recent case.


<ActionIdentity>
     <GlobalBusinessActionCode>Formally confirms the status of line item(s) in a Purchase Order. A Purchase Order line item may have one of the following states: accepted, rejected, or pending.</GlobalBusinessActionCode>
     <InstanceIdentifier>md.o1loqohc.efs41f.</InstanceIdentifier>
     <VersionIdentifier>V02.00</VersionIdentifier>
</ActionIdentity>


The Business Document Description is written to a temporary variable, which is only overwritten if “/ServiceHeader/ProcessControl/TransactionControl/ActionControl/ActionIdentity/description/FreeFormText” is populated. Since this is an optional value, the temporary variable was not properly cleared, leading to the stray text. A hotfix is available to rid the world of this menace.
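
The bug pattern is easy to reproduce in miniature. Here’s a small Python sketch (illustrative only – this is not BTARN code) of a temporary variable that is only overwritten when an optional field is present, so the previous message’s value leaks through:

```python
def collect_descriptions(messages):
    """Mimic the bug: 'description' is only overwritten when the optional
    FreeFormText field is present, so a stale value from the previous
    message leaks into any message that omits the field."""
    results = []
    description = ""  # BUG: initialised once, never reset per message
    for msg in messages:
        if msg.get("FreeFormText"):
            description = msg["FreeFormText"]
        results.append(description)
    return results

# The second message omits the optional field, so the first one's text leaks:
print(collect_descriptions([{"FreeFormText": "Confirms PO line status"}, {}]))
# → ['Confirms PO line status', 'Confirms PO line status']
```

The fix is the same as the hotfix describes: reset the variable at the top of each iteration (or scope it inside the loop).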



A replacement S/MIME component is now available to remedy those cases with unexplained service validation errors. The error appears to be random; however, in high-volume processing the right circumstances happen frequently enough to generate a couple of failures a day. The “exsmime.dll” has to be updated to remedy the problem. The fix was to be included in BizTalk R2 SP1, but early releases of SP1 may not have the latest version, 8.2.254.0. The problem also exists in BizTalk 2009.

BizTalk Benchmark Wizard – New release

The Benchmark Wizard was released earlier this year, and even though we got lots of good feedback we’ve also had requests for some changes.

If you haven’t yet heard about this tool, its purpose is to verify that your BizTalk environment performs as expected. There are two different scenarios you can run: a Messaging and an Orchestration scenario. Each of the scenarios has been tested on various environments and configurations. The results of these tests provided the tool with a set of KPIs, which your test result will be benchmarked against. For more information about the Benchmark Wizard:

Benchmark your BizTalk Server (Part 1) – Overview

Benchmark your BizTalk Server (Part 2) – How to install

Benchmark your BizTalk Server (Part 3) – Drill down and analyse

The most common feedback is related to the fact that it was very difficult to meet the KPIs. A reason for this is that the original tests were executed while global tracking was disabled. This was unfortunate, as DTA tracking has ~30% overhead. We have done a re-run of all the tests and updated the KPIs for the new version (shown at the bottom of this page).

Furthermore, there seems to be a question of how to interpret the result. What does “Succeeded” mean? Does it reflect the best possible result or just good enough? To make this clearer, we’ve implemented the “stop light approach”: if you’ve Succeeded you should be proud of yourself and make a blog post, while an Acceptable result is nothing to be ashamed of.

 


 

There have also been some other fixes, such as resetting all the counters when you re-run the test and fixing the CPU counters to show correct values.

Special thanks to Microsoft and Ewan Fairweather for letting me use their test lab!

Download BizTalk Benchmark Wizard from CodePlex

Scenario KPI’s: Messaging Single and Multi Message Box

| # of BTS Srv | # CPU/BTS Srv | # SQL Srv | # CPU/SQL Srv | Msgs/Sec | Msgs/Sec |
|---|---|---|---|---|---|
| 1 | 1 Quad | (1) | (1) | 160 | 200 |
| 1 | 1 Quad | 1 | 1 Quad | 280 | 350 |
| 1 | 1 Quad | 1 | 2 Quad | 390 | 490 |
| 1 | 1 Quad | 1 | 2 Quad | 560 | 700 |
| 2 | 1 Quad | 1 | 2 Quad | 620 | 770 |
| 2 | 2 Quad | 1 | 2 Quad | 730 | 910 |
| 2 | 2 Quad | 1 | 4 Quad | 780 | 980 |

Scenario KPI’s: Orchestration Single Message Box

| # of BTS Srv | # CPU/BTS Srv | # SQL Srv | # CPU/SQL Srv | Msgs/Sec | Msgs/Sec |
|---|---|---|---|---|---|
| 1 | 1 Quad | (1) | (1) | 110 | 140 |
| 1 | 1 Quad | 1 | 1 Quad | 170 | 210 |
| 1 | 1 Quad | 1 | 2 Quad | 190 | 240 |
| 1 | 1 Quad | 1 | 2 Quad | 220 | 270 |
| 2 | 1 Quad | 1 | 2 Quad | 230 | 290 |
| 2 | 2 Quad | 1 | 2 Quad | 260 | 320 |
| 2 | 2 Quad | 1 | 4 Quad | 300 | 370 |

 

Test environment:

| Type | Model | CPU Type | # of CPUs | # of Cores/CPU | Architecture | RAM | Local Disks | OS | Software |
|---|---|---|---|---|---|---|---|---|---|
| Database | DL785 | Intel Xeon | 8 x 2.4 GHz | 4 | x64 | 130 GB | 2 x 72 GB* | Win2k8 SP2 EE 64-bit | SQL Server 2008 SP1 |
| BTS Receive host | R805 | Intel Xeon | 2 x 2.33 GHz | 4 | x64 | 8 GB | 2 x 72 GB 10k SAS | Win2k8 SP2 EE 64-bit | BizTalk Server 2009 |
| BTS Send host | R805 | Intel Xeon | 2 x 2.33 GHz | 4 | x64 | 8 GB | 2 x 72 GB 10k SAS | Win2k8 SP2 EE 64-bit | BizTalk Server 2009 |
| Load server | R805 | Intel Xeon | 2 x 2.33 GHz | 4 | x64 | 8 GB | 2 x 72 GB 10k SAS | Win2k8 SP2 EE 64-bit | BizTalk Benchmark Wizard |
| Back-end server | R805 | Intel Xeon | 2 x 2.33 GHz | 4 | x64 | 8 GB | 2 x 72 GB 10k SAS | Win2k8 SP2 EE 64-bit | Indigo Service |

* Storage: EMC Clarion CX-240 ( 5 solid state drives )

Configuration:

Be aware of your Azure bill!

I have been playing around with Azure since it was first released at the PDC. I have since done several demos for Microsoft and my employer Logica. The demos have been pretty much the same demo over and over, which is why I stopped the project rather than deleting it.


To be fair, it’s pretty clearly stated that a stopped deployment will continue to accrue charges, but if you’re doing demos like I have – make sure to delete it!!!

I got billed ~$700 since February.

Promote properties in an EDI schema using the EDI Disassembler

I’ve been doing a lot of EDI-related work in BizTalk lately and I have to say that I’ve really enjoyed it! EDI takes a while to get used to (see the example below), but once you start to understand it, it’s a really nice, strict standard – with some cool features built into BizTalk!

UNB+IATB:1+6XPPC+LHPPC+940101:0950+1'
UNH+1+PAORES:93:1:IA'
MSG+1:45'
IFT+3+XYZCOMPANY AVAILABILITY'
ERC+A7V:1:AMD'
IFT+3+NO MORE FLIGHTS'
ODI'
TVL+240493:1000::1220+FRA+JFK+DL+400+C'
...
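
To get a feel for the structure, here’s a little Python sketch (not BizTalk code) that splits an EDIFACT interchange into segments and elements. It assumes the default separators – `'` ends a segment, `+` separates data elements, `:` separates components – and ignores the `?` release (escape) character for brevity:

```python
def parse_edifact(interchange):
    """Split a raw EDIFACT interchange into (tag, elements) tuples.
    Components within an element are split on ':'."""
    segments = [s.strip() for s in interchange.strip().split("'") if s.strip()]
    parsed = []
    for seg in segments:
        elements = seg.split("+")
        tag = elements[0]                       # e.g. UNB, UNH, MSG ...
        data = [e.split(":") for e in elements[1:]]
        parsed.append((tag, data))
    return parsed

for tag, data in parse_edifact("UNB+IATB:1+6XPPC+LHPPC+940101:0950+1'UNH+1+PAORES:93:1:IA'"):
    print(tag, data)
```

Seeing the message decomposed this way makes fields like UNB2.1 or UNH2.1 (which BizTalk promotes, below) much easier to locate.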

There are however some things that don’t work as expected …

Promoting values

According to the MSDN documentation, the EDI Disassembler by default promotes the following EDI fields: UNB2.1, UNB2.3, UNB3.1, UNB11; UNG1, UNG2.1, UNG3.1; UNH2.1, UNH2.2, UNH2.3.

There are however situations where one would like other values promoted.

In my case I wanted the C002/1001 value in the BGM segment. This is a value identifying the purpose of the document, and I needed to route the incoming message based on it.

The short version is that creating a property schema, promoting the field in the schema and having the EDI Disassembler promote the value will not work (as it does with the XML Disassembler). To do this you’ll need a custom pipeline component to promote the value. Rikard Alard seems to have come to the same conclusion here.

Promote pipeline component to use

If you don’t want to spend time writing your own pipeline component to do this, you can find a nice “promote component” on CodePlex here, by Jan Eliasen.

If you however expect to receive lots and lots of big messages, you might want to look into changing the component to use XPathReader and the custom stream implementations in Microsoft.BizTalk.Streaming.dll. You can find more detailed information on how to do that in this MSDN article.

Communication with all MessageBoxes has now been re-established

Microsoft has released a hotfix for information messages in the BizTalk Server 2009 event log that read something like “Communication with MessageBox <message box database name> on SQL Instance <SQL server instance name> has been re-established”, with event ID 6999, followed by a “Communication with all MessageBoxes has now been re-established” information entry with event […]

Tagging Objects in the AppFabric Cache

In two of my previous entries I outlined functionality and patterns used in the AppFabric Cache.  In this entry I wanted to expand and look at another area of functionality that people have come to expect when working with cache technology. 


This expectation is the ability to tag content with more information than just the key.  As you start to examine this expectation you will soon find yourself asking if the tagged data can be related to each other and finally if it is possible to remove the related data.


One of the nice things about the AppFabric Cache is that it is so easy to work with and can be so simple.  Then, if needed, as with this topic, you can take advantage of the additional functionality.  This is where the real benefits of the AppFabric Cache come into play over the other cache alternatives.


Let’s take a deeper look at tagging.  Tagging is essentially assigning one or more string values to an object contained in the cache.  The object in the cache can then be retrieved by using one tag or multiple tags.  When you tag an object, all of the tagged objects reside within a region.  A region is an additional container inside the cache.  Because regions are optional, if you want to use them you need to explicitly create them at runtime within your application code.  You can set up different regions for different types of related data.  When you utilize regions, you have the option to retrieve all objects in a region, and so by default you have a solution that gives you access to all the dependent data.  We can take this one step further: you can also delete all of the data within a region without having to loop through all entries to find the specific keys and their dependencies.


As an example we could create a region called ‘Beverages’ and then we could cache all of the beverages we sell and tag each item by the type of beverage such as Soft Drink, Wine, Beer.  We could even go one step further and provide multiple tags so that we could further segment the Wine category into White or Red or Merlot, Zinfandel, Riesling, etc.  At this point the application could retrieve all of the catalog items based on the search criteria that were entered.


Let’s look at how we can set up the region and tags.  We will also look at the methods that are available to interact with objects using tags and, lastly, how we can manage the data in the cache: adding, retrieving and deleting.


Setting Up a Region:


The code that follows assumes that you already know how to create a cache (either through code or through the PowerShell cmdlets, as found in my previous post here).


To create a region, the cache must already exist.  Once that is done, we can pass in the cache name, use the GetCache method and then call the CreateRegion method, passing in a region name.  We can create as many regions as we need, based on the manner in which you wish to segment the data.  One thing to keep in mind, however, is that there can be performance implications when using regions.  The code below shows how we can create the region.


private void CreateRegion(string CacheName, string RegionName)
{
    // This can also be kept in a config file
    var config = new DataCacheFactoryConfiguration();
    config.Servers = new List<DataCacheServerEndpoint>
    {
        new DataCacheServerEndpoint(Environment.MachineName, 22233)
    };

    DataCacheFactory dcf = new DataCacheFactory(config);

    if (dcf != null)
    {
        var dataCache = dcf.GetCache(CacheName);
        dataCache.CreateRegion(RegionName);
    }
}


Now that we have a cache with a region created, let’s look at the methods that are available to interact with tags.


Methods to work with tagging (from MSDN):


| Method | Description |
|---|---|
| GetObjectsByTag | Provides a simple way to access objects that contain tags (exact match, intersection, or union). The region name is a required parameter. |
| GetObjectsByAnyTag | Returns a list of objects that have tags matching any of the tags provided in the parameter of this method. |
| GetObjectsByAllTags | Returns a list of objects that have tags matching all of the tags provided in the parameter of this method. |
| GetObjectsInRegion | Returns a list of all objects in a region. This method is useful when you do not know all the tags used in the region. |
| GetCacheItem | Returns a DataCacheItem object. In addition to the cached object and other information associated with the cached object, the DataCacheItem object also includes the associated tags. |
| Add | When adding an object to cache, this method supports associating tags with that item in the cache. |
| Put | When putting an object into cache, this method can be used to replace tags associated with a cached object. |
| Remove | This method deletes the cached object and any associated tags. |


Managing the data in the cache


There are a number of methods to get objects out of the cache, but logically I like to start with adding to the cache first.  So, let’s take a look at the Put method.  The Put method will update an object whose key is already contained in the cache (whereas Add will throw an exception if the key is already present).  The Put method can also update or add new tags to an existing object in the cache.  As we look at the Put method signature below, we can see that this version of the method accepts the key and value just as the other overloads do, but this one also takes a collection of tags as well as the name of the region that the cached item will reside in.


public DataCacheItemVersion Put(
    string key,
    Object value,
    IEnumerable<DataCacheTag> tags,
    string region
)


The code below shows how we can use the Put method and include multiple tags.


private void InsertCacheObjectWithTag(string CacheName, string RegionName)
{
    // This can also be kept in a config file
    var config = new DataCacheFactoryConfiguration();
    config.Servers = new List<DataCacheServerEndpoint>
    {
        new DataCacheServerEndpoint(Environment.MachineName, 22233)
    };

    DataCacheFactory dcf = new DataCacheFactory(config);

    if (dcf != null)
    {
        List<DataCacheTag> tags = new List<DataCacheTag>
        {
            new DataCacheTag("Wine"),
            new DataCacheTag("Red"),
            new DataCacheTag("Merlot")
        };

        var dataCache = dcf.GetCache(CacheName);
        dataCache.Put("WineKey", "WineValue", tags, RegionName);
    }
}


Now we can look at the methods to retrieve objects.  There are four main Get methods.  We can get by tag, any tag, all tags and finally any object that exists in the region no matter what tag.  The GetObjectsInRegion method is what provides us the ability to implement a scenario in which all cached objects are related and can be treated as a group.  The related data can also be removed by calling dataCache.RemoveRegion(RegionName), just as we called the CreateRegion method above.


For people that have been working with Memcached: there you can delete dependent data through the cas (check and set) operation.  The reason I bring this up is that more people are familiar with Memcached, and therefore I am often asked whether AppFabric Caching has this or that functionality.  What I am finding is that it has the same functionality and more; it is just implemented a bit differently.


Anyways, if we want to retrieve all the objects in the cache that have a Wine tag we can use the GetObjectsByTag as shown below:


public IEnumerable<KeyValuePair<string, object>> GetLookUpCacheDataByTag(string TagValue)
{
    var config = new DataCacheFactoryConfiguration();
    config.Servers = new List<DataCacheServerEndpoint>
    {
        new DataCacheServerEndpoint(Environment.MachineName, 22233)
    };

    DataCacheFactory dcf = new DataCacheFactory(config);
    var cache = dcf.GetCache(cacheName);

    DataCacheTag dct = new DataCacheTag(TagValue);

    // cacheName and regionName could either be passed in or internal variables
    return cache.GetObjectsByTag(dct, regionName);
}


And lastly, we can remove items based on a tag.  I already mentioned that we can call the RemoveRegion method to remove all the related data that is grouped together in a region.  If there is a specific item that you want removed, you can call the Remove method and pass in the key.  If you want to delete based on a tag value, you have to call one of the GetXXX methods, obtain the key, and then call the Remove method, passing in the returned key.
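
To make the semantics concrete, here is a small in-memory Python analogue of the region/tag model described above (my own sketch for illustration only – this is not the AppFabric API):

```python
class TaggedCache:
    """In-memory analogue of AppFabric's region/tag semantics."""

    def __init__(self):
        self.regions = {}  # region -> {key: (value, set_of_tags)}

    def create_region(self, region):
        self.regions.setdefault(region, {})

    def put(self, key, value, tags, region):
        # Like Put: inserts or replaces the object, including its tags
        self.regions[region][key] = (value, set(tags))

    def get_objects_by_any_tag(self, tags, region):
        wanted = set(tags)
        return {k: v for k, (v, t) in self.regions[region].items() if t & wanted}

    def get_objects_by_all_tags(self, tags, region):
        wanted = set(tags)
        return {k: v for k, (v, t) in self.regions[region].items() if wanted <= t}

    def remove_region(self, region):
        # Like RemoveRegion: drops every related object in one call
        del self.regions[region]

cache = TaggedCache()
cache.create_region("Beverages")
cache.put("WineKey", "Merlot 2005", ["Wine", "Red", "Merlot"], "Beverages")
cache.put("BeerKey", "Pale Ale", ["Beer"], "Beverages")
print(cache.get_objects_by_all_tags(["Wine", "Red"], "Beverages"))
# → {'WineKey': 'Merlot 2005'}
```

The point of the sketch is the grouping: because every tagged object lives in a region, one RemoveRegion-style call disposes of all the related data without hunting for individual keys.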