by community-syndication | May 14, 2008 | BizTalk Community Blogs via Syndication
Michele recently wrote a lengthy but deliciously thorough article on MSDN (hat tip: Lynn) called Application Deployment Strategies which identifies five core scenarios for using Windows Communication Foundation within a distributed application. Those scenarios include:
Enterprise web services. Think secure, interoperable services that may also utilize new-ish WS* standards such as WS-AtomicTransaction and WS-ReliableMessaging.
Web 2.0 services. […]
by community-syndication | May 14, 2008 | BizTalk Community Blogs via Syndication
Another new WCF feature that’s part of .NET 3.5 SP1 has to do with better support for object references. DataContractSerializer has always supported serializing object references and dealing with graphs, including cycles, and not just simple trees. But doing so is not the default behavior — you have to tell DataContractSerializer that you want it to preserve object references when you instantiate it.
Let’s look at a simple example. Suppose that you have the following cyclic object graph (I’m assuming the same Person type that I used in my previous post):
Person p = new Person();
p.Id = "123";
p.Name = "Aaron";
p.Spouse = new Person();
p.Spouse.Id = "456";
p.Spouse.Name = "Monica";
p.Spouse.Spouse = p;
…
And now let’s suppose that you want to serialize it. If you create the DataContractSerializer using the default constructor, it will throw an exception when it detects the cycle during serialization. However, you can tell DataContractSerializer to preserve object references using one of the other constructors:
DataContractSerializer dcs = new DataContractSerializer(typeof(Person),
    null, int.MaxValue, false, true /* preserve object refs */, null);
using (FileStream fs = new FileStream("person.xml", FileMode.Create))
{
    dcs.WriteObject(fs, p);
}
The resulting person.xml file now looks like this:
<Person z:Id="1" xmlns="http://schemas.datacontract.org/2004/07/SerializationSp1" xmlns:i="http://www.w3.org/2001/XMLSchema-instance" xmlns:z="http://schemas.microsoft.com/2003/10/Serialization/">
  <Id z:Id="2">123</Id>
  <Name z:Id="3">Aaron</Name>
  <Spouse z:Id="4">
    <Id z:Id="5">456</Id>
    <Name z:Id="6">Monica</Name>
    <Spouse z:Ref="1" i:nil="true"/>
  </Spouse>
</Person>
Notice that each reference type has been given an “Id” attribute and the nested Spouse reference refers back to the containing Person via the Ref attribute, thereby preserving the references within the XML.
As of SP1, the definitions for the Id/Ref attributes are part of the generated schema. If you run SvcUtil.exe /dconly over the assembly containing Person, it will produce a schema file for the “http://schemas.microsoft.com/2003/10/Serialization/” namespace. Within that schema, you’ll find the following definitions for Id/Ref, which are defined as ID/IDREF types:
…
<xs:attribute name="Id" type="xs:ID" />
<xs:attribute name="Ref" type="xs:IDREF" />
…
ID and IDREF are standard DTD/XSD types that are widely supported across platforms.
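A quick way to convince yourself that the cycle really survives is to read the serialized XML back in and compare object references; here’s a minimal round-trip sketch using the same constructor as above (I serialize to a MemoryStream instead of a file for brevity):

```csharp
using System;
using System.IO;
using System.Runtime.Serialization;

public class Person
{
    public string Id;
    public string Name;
    public Person Spouse;
}

class RoundTrip
{
    static void Main()
    {
        Person p = new Person { Id = "123", Name = "Aaron" };
        p.Spouse = new Person { Id = "456", Name = "Monica", Spouse = p };

        DataContractSerializer dcs = new DataContractSerializer(typeof(Person),
            null, int.MaxValue, false, true /* preserve object refs */, null);

        using (MemoryStream ms = new MemoryStream())
        {
            dcs.WriteObject(ms, p);
            ms.Position = 0;
            Person copy = (Person)dcs.ReadObject(ms);

            // The z:Ref attribute is resolved back to the same object instance,
            // so copy.Spouse.Spouse is copy itself, not a second Person.
            Console.WriteLine(ReferenceEquals(copy, copy.Spouse.Spouse)); // True
        }
    }
}
```

Without the preserve-references flag, deserializing would either fail or give you a tree where the inner Spouse is a distinct copy rather than the original object.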
One problem with employing this object-reference-preservation technique is that you don’t have direct control over how the DataContractSerializer is constructed when defining your WCF services. You can, however, implement a behavior that intercepts the standard serializer creation process so that you can enable this feature. Sowmy provides a complete example of how to accomplish this over on his blog.
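The gist of that interception approach is to derive from DataContractSerializerOperationBehavior and override CreateSerializer so the reference-preserving constructor gets used. The rough sketch below is my own simplification, not Sowmy’s complete example; the class name is hypothetical:

```csharp
using System;
using System.Collections.Generic;
using System.Runtime.Serialization;
using System.ServiceModel.Description;
using System.Xml;

// Hypothetical behavior that swaps in a reference-preserving serializer
// for a given operation. See Sowmy's post for the full, production version.
public class PreserveReferencesOperationBehavior : DataContractSerializerOperationBehavior
{
    public PreserveReferencesOperationBehavior(OperationDescription operation)
        : base(operation) { }

    public override XmlObjectSerializer CreateSerializer(Type type,
        string name, string ns, IList<Type> knownTypes)
    {
        return new DataContractSerializer(type, name, ns, knownTypes,
            int.MaxValue, false, true /* preserve object refs */, null);
    }

    public override XmlObjectSerializer CreateSerializer(Type type,
        XmlDictionaryString name, XmlDictionaryString ns, IList<Type> knownTypes)
    {
        return new DataContractSerializer(type, name, ns, knownTypes,
            int.MaxValue, false, true /* preserve object refs */, null);
    }
}
```

You would then replace the default DataContractSerializerOperationBehavior on each OperationDescription with this one, typically from a custom contract or operation behavior attribute.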
by community-syndication | May 13, 2008 | BizTalk Community Blogs via Syndication
I came across an unusual requirement earlier this year while at a leading golfing industry company: We needed to retrieve current tournament standings from the PGA Web site, pulling scores and statistics by FTP every 2 minutes while the tournaments were in progress. Ultimately, the standings would trickle through to a data-driven Web site. The interesting part was the pickup location: there was a predefined FTP folder structure, with a separate folder for each tournament (the files always had the same names, but the locations varied). We could have multiple receive locations that become active during a given week, but only for a finite period of time, all pointing at different FTP locations. BizTalk has dynamic send ports of course, but no dynamic receive locations. How to handle this? Although manually managing all this was possible, it really wasn’t a viable solution given the amount of ongoing effort it would require.
I found out that there was an overall tournament schedule file that was posted on a weekly basis, and that’s how I solved it. The solution basically:
- retrieves a schedule file (on a weekly basis) and publishes the schedule to the message box
- a CreateReveiveLocations orchestration subscribes to the schedule file and picks it up
- the orchestration maps the schedule to a structure that is easier to work with (we map inside the orchestration as we thought we may have other interesting things we could do with the schedule file in a future release)
- the orchestration removes any previously-created dynamic receive locations (based on a standard naming pattern, and they all belong to a known receive port)
- the orchestration calls a helper class that creates all the FTP receive locations. We know when the tournament starts, so:
- we only create receive locations for current and future tournaments
- we set the start date for the tournament to be the day of the tournament, so no polling will happen before the tournament starts
How cool is that? It’s very cool to drop an XML document into BizTalk, and watch it create a bunch (and there are a LOT) of receive locations that it will subsequently use to drive a process. I’m all in favor of self-configuring environments!
We were in POC mode, and this took me a couple of days to get it all done (using good naming conventions, etc., of course) and working end-to-end. I’m sure I could have come up with a pretty good way to do this just in code, but it would have taken a LOT longer than 2 days.
Here’s what the orchestration looks like:
There are two calls to a helper class above. The first removes any receive locations that were previously created (I use a naming convention and a unique name suffix, so I know which ones are dynamic). The second call passes in all the information required to create the new batch of receive locations.
The schedule file contained a tournament date. Because of that, we can skip creating FTP receives for tournaments that have ended.
Drilling into the CreateFtpReceiveLocation code, this snippet :
- first gets a reference to the receive port (which has a well-known name)
- creates a new receive location
- sets the transport type to FTP
Note that if this were not a POC, I would have cached the ReceiveHandler and made other optimizations. However, even without optimizations, this creates hundreds of receive locations in seconds.
This last snippet:
- assigns the pipeline to use for this receive location
- sets the receive location’s transport configuration
- saves the changes to the BizTalk management database
There you have it. With a trivial amount of code, I eliminated what would have been significant ongoing manual intervention in the process. I fully automated what would have been a very tedious, and error-prone, task. The client was thrilled as this was a very elegant solution to an ongoing problem.
For anyone after the code rather than pretty pictures of it :), please see below.
Technorati Tags: BizTalk,SOA,Dynamic Receive,ExplorerOM
—————————–
public static class LocationManagement
{
    const string CONNECTION_STRING = "Integrated Security=SSPI;Persist Security Info=False;Initial Catalog=BizTalkMgmtDb;Server=(local)";

    public static void CreateFtpReceiveLocations(string receivePortName, XmlDocument receiveLocations, string prefix, string server, string addressTemplate, string uid, string pwd)
    {
        XmlNodeList locs = receiveLocations.SelectNodes("//ReceiveLocation");
        DateTime cutoffDate = System.DateTime.Now.Add(new TimeSpan(-7, 0, 0, 0));
        foreach (XmlNode loc in locs)
        {
            // skip any tournament whose start date is more than 7 days in the past
            if (DateTime.Parse(loc.Attributes["StartDate"].Value) > cutoffDate)
            {
                CreateFtpReceiveLocation(receivePortName, prefix + loc.Attributes["Address"].Value, string.Format(addressTemplate, loc.Attributes["Address"].Value), server, uid, pwd, DateTime.Parse(loc.Attributes["StartDate"].Value));
            }
        }
    }

    public static bool CreateFtpReceiveLocation(string receivePortName, string receiveLocationName, string address, string server, string uid, string pwd, DateTime startDate)
    {
        try
        {
            BtsCatalogExplorer root = new BtsCatalogExplorer();
            root.ConnectionString = CONNECTION_STRING;

            // Get the receive port (it has a well-known name)
            ReceivePort receivePort = root.ReceivePorts[receivePortName];

            // Create a new receive location and add it to the receive port
            ReceiveLocation myreceiveLocation = receivePort.AddNewReceiveLocation();
            myreceiveLocation.Address = address;
            myreceiveLocation.Name = receiveLocationName;

            // Find the FTP receive handler
            foreach (ReceiveHandler handler in root.ReceiveHandlers)
            {
                if (handler.TransportType.Name == "FTP")
                {
                    myreceiveLocation.ReceiveHandler = handler;
                    break;
                }
            }

            // Associate a transport protocol and URI with the receive location
            ProtocolType protocol = root.ProtocolTypes["FTP"];
            myreceiveLocation.TransportType = protocol;

            Pipeline pipeline = root.Pipelines["Microsoft.BizTalk.DefaultPipelines.PassThruReceive"];
            myreceiveLocation.ReceivePipeline = pipeline;
            myreceiveLocation.StartDate = startDate;
            myreceiveLocation.StartDateEnabled = true;

            string receiveLocationTransportTypeData = "<CustomProps><AdapterConfig vt=\"8\"><Config xmlns:xsi=\"http://www.w3.org/2001/XMLSchema-instance\" xmlns:xsd=\"http://www.w3.org/2001/XMLSchema\"><uri>ftp://BTS2006Dev:21/ClientName.PGA/*.xml</uri><serverAddress>BTS2006Dev</serverAddress><serverPort>21</serverPort><userName>administrator</userName><password>******</password><accountName /><fileMask>*.xml</fileMask><targetFolder>ClientName.PGA</targetFolder><representationType>binary</representationType><maximumBatchSize>0</maximumBatchSize><maximumNumberOfFiles>0</maximumNumberOfFiles><passiveMode>False</passiveMode><firewallType>NoFirewall</firewallType><firewallPort>21</firewallPort><pollingUnitOfMeasure>Seconds</pollingUnitOfMeasure><pollingInterval>60</pollingInterval><errorThreshold>10</errorThreshold><maxFileSize>100</maxFileSize></Config></AdapterConfig></CustomProps>";

            XmlDocument docTransportTypeData = new XmlDocument();
            docTransportTypeData.LoadXml(System.Web.HttpUtility.HtmlDecode(receiveLocationTransportTypeData));
            docTransportTypeData.SelectSingleNode("//userName").InnerText = uid;
            docTransportTypeData.SelectSingleNode("//password").InnerText = pwd;
            docTransportTypeData.SelectSingleNode("//uri").InnerText = address;
            docTransportTypeData.SelectSingleNode("//serverAddress").InnerText = server;

            // Assign the edited document, not the original template string,
            // so the per-location values actually take effect
            myreceiveLocation.TransportTypeData = docTransportTypeData.OuterXml;

            // Enable the receive location
            myreceiveLocation.Enable = true;

            // Save changes to the BizTalk management database
            root.SaveChanges();
            return true;
        }
        catch (Exception ex)
        {
            throw new Exception("Error creating receive location", ex);
        }
    }
}
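To show how the pieces hang together, here is a hypothetical driver. The schedule XML shape is inferred from the SelectNodes/attribute access in the helper (one ReceiveLocation element per tournament, with Address and StartDate attributes); the port name, prefix, server, and credentials are illustrative, not the actual PGA feed:

```csharp
using System;
using System.Xml;

class Driver
{
    static void Main()
    {
        // Hypothetical schedule document, shaped the way the helper reads it
        XmlDocument schedule = new XmlDocument();
        schedule.LoadXml(@"<Schedule>
            <ReceiveLocation Address='Tournament1' StartDate='2008-05-15' />
            <ReceiveLocation Address='Tournament2' StartDate='2008-05-22' />
        </Schedule>");

        // Creates one FTP receive location per current/future tournament
        LocationManagement.CreateFtpReceiveLocations(
            "PgaStandingsReceivePort",            // well-known receive port name
            schedule,
            "FTPDyn_",                            // naming-convention prefix used later for cleanup
            "ftp.example.com",
            "ftp://ftp.example.com:21/{0}/*.xml", // addressTemplate; {0} is the tournament folder
            "ftpUser",
            "ftpPassword");
    }
}
```

Because the receive location names all start with the known prefix and belong to the known port, the cleanup pass can find and delete the previously created dynamic locations before creating the new batch.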
by community-syndication | May 13, 2008 | BizTalk Community Blogs via Syndication
WCF: A technique for debugging inconsistencies between the WSDL and response messages.
I have consumed several WCF services whose WSDLs do not conform to the response messages (see below; I’ve bolded the sample text related to one issue). We can fix the issues by changing the proxy code. It’s not a big issue, but we must manually change the proxy every time we update the proxy for this service.
Here is the checklist for how we debug this case:
[I created the proxy for the web service using any of the usual methods in Visual Studio 2008, such as the “Add Service Reference” command.]
1. We found that the client code cannot get all the data from the service. Some data are lost.
2. Using SoapUI, make sure the response actually returned all the data. If it did, that is the sign of this inconsistency!
3. Using the debugger, we found that the data are lost exactly at the
response = client.Method(request)
line in the client code.
What does it mean? The data is successfully returned in the response message, but the client proxy code cannot process it: when the proxy deserializes the response, some data are lost. SoapUI does not process the returned data; it just shows it as raw text. The proxy code tries to deserialize this text, converting it into objects in memory. [The deserializer is code in the Microsoft libraries; WCF uses one of two: XmlSerializer (in System.Xml.Serialization) or DataContractSerializer (in System.Runtime.Serialization).] Some of the returned data disappeared during the deserialization process. These cases are described in the article “WCF: values disappeared in response: derived classes and serialization/deserialization order error” [http://geekswithblogs.net/LeonidGaneline/archive/2008/05/01/wcf-values-disappeared-in-response-derived-classes-and-serializationdeseriazlization-order.aspx]
4. We have to open the proxy source code (usually the Reference.cs file) and fix the code (see below). That makes it inconsistent with the WSDL, so the next time you update this Web or Service reference you will have to fix the code again. But your client is fixed now!
5. Document this fix! And do not keep that documentation only in the proxy code, since the proxy will be regenerated. :)
You are lucky if the inconsistency is in the response messages. The worst situation is when the inconsistency is in the request messages. There are only two options: either the service does not care about these inconsistencies and you are lucky again, or the service rejects your request with a fault message. Moreover, I can assure you this fault message will not give you any clue about what is going on. Just try to get a “correct” request from any source you can.
[WSDL ]
<xsd:complexType name="ArrayOf_xsd_string">
  <xsd:sequence>
    <xsd:element minOccurs="0" maxOccurs="unbounded" name="item" type="xsd:string" />
  </xsd:sequence>
</xsd:complexType>
<xsd:complexType name="PromoDetails">
  <xsd:sequence>
    <xsd:element name="assetTypes" nillable="true" type="impl:ArrayOf_xsd_string" />
  </xsd:sequence>
</xsd:complexType>
[Response message]
<assetTypes>
<assetTypes>Album</assetTypes>
</assetTypes>
[Response message How it should be with regard to WSDL]
<assetTypes>
<item>Album</item>
</assetTypes>
[Auto generated Proxy code]
private string[] assetTypesField;
[System.Xml.Serialization.XmlArrayAttribute(IsNullable=true, Order=1)]
[System.Xml.Serialization.XmlArrayItemAttribute("item", IsNullable = false)]
public string[] assetTypes {
get {
return this.assetTypesField;
}
set {
this.assetTypesField = value;
this.RaisePropertyChanged("assetTypes");
}
}
[Fixed Proxy code]
private string[] assetTypesField;
[System.Xml.Serialization.XmlArrayAttribute(IsNullable=true, Order=1)]
[System.Xml.Serialization.XmlArrayItemAttribute("assetTypes", IsNullable = false)]
public string[] assetTypes {
get {
return this.assetTypesField;
}
set {
this.assetTypesField = value;
this.RaisePropertyChanged("assetTypes");
}
}
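If you don’t have SoapUI handy, WCF’s built-in message logging is another way to see the raw response XML before the proxy deserializes it. A minimal client-side app.config fragment looks something like this (the log path and listener name are illustrative; open the resulting .svclog file with the Service Trace Viewer):

```xml
<system.serviceModel>
  <diagnostics>
    <messageLogging logEntireMessage="true"
                    logMessagesAtServiceLevel="true"
                    logMessagesAtTransportLevel="true" />
  </diagnostics>
</system.serviceModel>
<system.diagnostics>
  <sources>
    <source name="System.ServiceModel.MessageLogging">
      <listeners>
        <add name="messages"
             type="System.Diagnostics.XmlWriterTraceListener"
             initializeData="c:\logs\messages.svclog" />
      </listeners>
    </source>
  </sources>
</system.diagnostics>
```

Comparing the logged response against what your proxy objects contain makes the “data lost in deserialization” symptom obvious.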
by community-syndication | May 13, 2008 | BizTalk Community Blogs via Syndication
If you want all resources of your BizTalk application to be registered in the GAC on MSI import, then prior to exporting the BizTalk application to an MSI file you must check the ‘Add to the global assembly cache on MSI import (gacutil)’ option in the Modify Resources dialog box. This option is unchecked by default. To avoid the […]
by community-syndication | May 13, 2008 | BizTalk Community Blogs via Syndication
One of the new WCF features in .NET 3.5 SP1 is that DataContractSerializer now supports serializing types that aren’t annotated with any serialization attributes like [DataContract] or [Serializable].
If you were using DataContractSerializer prior to SP1, you had to follow the rules outlined by Sowmy here. These rules illustrate that for custom classes you have a few choices. You can annotate the class with [DataContract] and [DataMember] to define an attribute-based mapping or implement IXmlSerializable to define a custom mapping. Or you can annotate the class with [Serializable] to automatically map all fields (like with .NET Remoting) or implement ISerializable to take things into your own hands (assuming IXmlSerializable wasn’t used).
As you can see from the rules, there is no allowance for types that haven’t been annotated with one of these serialization attributes and that don’t implement one of the serialization-related interfaces. In other words, you can’t serialize “plain old C# objects” (POCO for short).
The support for [Serializable] provided a nice migration path for traditional .NET Remoting types, which was nice, but the lack of support for POCO types meant you couldn’t move your ASP.NET Web services (ASMX) types over to the DataContractSerializer without sprinkling a bunch of new attributes on them.
Now, with .NET 3.5 SP1, you can serialize the following type with DataContractSerializer:
public class Person
{
public Person() { this.Id = Guid.NewGuid().ToString(); }
private string Id;
public string Name;
public Person Spouse;
}
For POCO types, DataContractSerializer only includes the public read/write fields and properties into the resulting XML Infoset. So in our example above, the private Id field won’t make it into the message.
Now you can simply use POCO types in your WCF service contracts and you don’t have to worry about changing the serializer back to XmlSerializer using [XmlSerializerFormat]. In other words, the following service contract works with the above type as-is in .NET 3.5 SP1:
[ServiceContract]
public interface IMarriageService
{
[OperationContract]
void MarryPeople(Person p1, Person p2);
}
Here’s a simple console program that uses DataContractSerializer to serialize a Person object:
class Program
{
static void Main(string[] args)
{
Person p = new Person();
p.Name = "Aaron";
p.Spouse = new Person();
p.Spouse.Name = "Monica";
DataContractSerializer dcs = new DataContractSerializer(typeof(Person));
using (FileStream fs = new FileStream("person.xml", FileMode.Create))
{
dcs.WriteObject(fs, p);
}
}
}
And here’s what the resulting person.xml file looks like:
<Person xmlns="http://schemas.datacontract.org/2004/07/SerializationSp1"
        xmlns:i="http://www.w3.org/2001/XMLSchema-instance">
  <Name>Aaron</Name>
  <Spouse>
    <Name>Monica</Name>
    <Spouse i:nil="true"/>
  </Spouse>
</Person>
When you use this technique, you have to be happy with the XML that DataContractSerializer gives you. In other words, you can’t customize the resulting XML in any way.
As soon as you place the [DataContract] attribute on the class, DataContractSerializer will only include fields/properties annotated with [DataMember] once again. Or if you apply the [Serializable] attribute, it will fall back to the standard [Serializable] mapping as well. For example, suppose I annotate the Person type with [DataContract]:
[DataContract]
public class Person
{
public Person() { this.Id = Guid.NewGuid().ToString(); }
private string Id;
public string Name;
public Person Spouse;
}
If I run my console program again, the resulting person.xml now looks like this:
<Person xmlns="http://schemas.datacontract.org/2004/07/SerializationSp1"
        xmlns:i="http://www.w3.org/2001/XMLSchema-instance"/>
Notice that none of the fields were serialized because they weren’t annotated with [DataMember]. Once I applied [DataContract], DataContractSerializer no longer treated it like a POCO type.
To summarize, DataContractSerializer provides several different mechanisms for defining the serialization mapping:
1. Simply rely on the public interface (POCO types) and take the default XML mapping
2. Use [Serializable] to only include fields
3. Use [DataContract] and [DataMember] and apply some basic customization
4. Use IXmlSerializable or ISerializable for advanced mapping customization
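For contrast with the POCO default, here’s a quick sketch of option 3, where [DataContract]/[DataMember] let you rename members and control the namespace (the contract name, element names, and namespace below are my own illustrative choices):

```csharp
using System;
using System.IO;
using System.Runtime.Serialization;

[DataContract(Name = "PersonContract", Namespace = "urn:example")]
public class Person
{
    [DataMember(Name = "FullName", Order = 0)]
    public string Name;

    [DataMember(Name = "Partner", Order = 1)]
    public Person Spouse;
}

class Demo
{
    static void Main()
    {
        Person p = new Person { Name = "Aaron", Spouse = new Person { Name = "Monica" } };

        DataContractSerializer dcs = new DataContractSerializer(typeof(Person));
        using (MemoryStream ms = new MemoryStream())
        {
            dcs.WriteObject(ms, p);
            // The root element is now <PersonContract> in the urn:example namespace,
            // with <FullName> and <Partner> child elements instead of the member names.
            Console.WriteLine(System.Text.Encoding.UTF8.GetString(ms.ToArray()));
        }
    }
}
```

So the trade-off is symmetric: POCO types cost you all control over the XML, while [DataContract] gives you that control back but makes you opt every member in explicitly.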
I was actually quite surprised to learn that they added this feature because it goes against the main reason for the original [DataContract] design (“boundaries are explicit”), at least according to the team in early design reviews. I asked for this feature (an implicit mapping) back then but got shot down for that very reason. But despite whatever principle it may violate, I like it, because it makes it simpler for folks to get up and running with WCF and it provides an easy migration path for your ASMX types.