'The use of type .. is not supported with NetDataContractSerializer' - why is NetDataContractSerializer being used?


This error occurs because the NetDataContractSerializer (NetDCS) is being used but the reference assembly is not shared with and included in the client. This question is not about that.

This question is about preventing NetDataContractSerializer from being used; or at least finding out why it is being used by the client.

  • Why does the client use NetDCS when I have not configured it to do so?

    According to the articles and posts I have found, enabling NetDCS is an explicit step, and I have not enabled it for the project or the WCF client. Is there a machine-level or framework-level override somewhere besting me?

  • Can I force the client to use the normal DataContractSerializer (DCS)?

    If so, can I force this on a per-type or per-namespace basis? The generic framework "base types" are already in a referenced assembly; ideally I could keep these resolving with NetDCS - but I'd give up NetDCS all around to "fix" this issue.

    If not, are there any more drastic measures available, such as munging the request/response XML?

    (I am fine manually adding in any required contract resolvers and dealing with mangled generated names.)
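For diagnosing the first question, the serializer choice is driven by the operation behaviors on the client endpoint: a NetDCS setup usually replaces the default DataContractSerializerOperationBehavior with a custom IOperationBehavior. A minimal sketch (the generated proxy class name is assumed) that lists what is actually attached:

```csharp
using System;
using System.ServiceModel.Description;

// Sketch: print every operation behavior on each operation of the
// client endpoint. If something other than the standard
// DataContractSerializerOperationBehavior shows up, that is the
// likely source of the NetDCS usage.
var client = new MyServiceEndPointClient(); // generated proxy (name assumed)
foreach (OperationDescription op in client.Endpoint.Contract.Operations)
{
    foreach (IOperationBehavior behavior in op.Behaviors)
    {
        Console.WriteLine("{0}: {1}", op.Name, behavior.GetType().FullName);
    }
}
```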

Using the WCF Test Client (which, as I understand it, always uses DCS) "works fine", which makes me believe the current issue is related to NetDCS deserialization that occurs when using the service from the project. I may be misunderstanding how the WCF Test Client works.

Here is the error message (which can be resolved by adding the reference assemblies for NetDCS, but I do not want to do that) to make sure I am not misunderstanding the issue. I am 95% sure this is on deserialization in the client.

The formatter threw an exception while trying to deserialize the message ..

'The use of type '..' as a get-only collection is not supported with NetDataContractSerializer. Consider marking the type with the CollectionDataContractAttribute attribute or the SerializableAttribute attribute or adding a setter to the property.'
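The shape the message is complaining about is a collection property with no setter, roughly like the following (type and member names are invented for illustration). DCS can deserialize into the existing list instance, but NetDCS rejects this pattern:

```csharp
using System.Collections.Generic;
using System.Runtime.Serialization;

[DataContract]
public class MyResult // hypothetical type standing in for '..'
{
    private List<string> items = new List<string>();

    [DataMember]
    public List<string> Items
    {
        get { return items; }
        // Adding a setter is one of the fixes the error message suggests:
        // set { items = value; }
    }
}
```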

And the stack summary:

Server stack trace: 
   at System.ServiceModel.Channels.ServiceChannel.ThrowIfFaultUnderstood(Message reply, MessageFault fault, String action, MessageVersion version, FaultConverter faultConverter)
   at System.ServiceModel.Channels.ServiceChannel.HandleReply(ProxyOperationRuntime operation, ref ProxyRpc rpc)
   at System.ServiceModel.Channels.ServiceChannel.Call(String action, Boolean oneway, ProxyOperationRuntime operation, Object[] ins, Object[] outs, TimeSpan timeout)
   at System.ServiceModel.Channels.ServiceChannelProxy.InvokeService

The WCF client configuration is the "simple" default configuration, and there is no code that modifies behaviors at run-time.

<system.serviceModel>
    <bindings>
      <basicHttpBinding>
        <binding name="BasicHttpBinding_myServiceBinding" />
      </basicHttpBinding>
    </bindings>
    <client>
      <endpoint address="endPoint.svc"
          binding="basicHttpBinding" bindingConfiguration="BasicHttpBinding_myServiceBinding"
          contract="MyServive.IMyServiceEndPoint" name="BasicHttpBinding_serviceEndPoint" />
    </client>
  </system.serviceModel>

The Service Reference is generated in VS 2013 and the project targets .NET 4. The project with the service reference and the test project are both simple class libraries - ie. they are not hosted in IIS.

Answer:

You can apply an XmlSerializerFormatAttribute (http://msdn.microsoft.com/en-us/library/system.servicemodel.xmlserializerformatattribute.aspx) to the service contract to force WCF to use the XmlSerializer instead of the data contract serializers.
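Applied to the contract interface from the config above, that looks roughly like this (the operation and return type are placeholders; the attribute goes on the contract or individual operations, not on the data classes):

```csharp
using System.ServiceModel;

[ServiceContract]
[XmlSerializerFormat] // forces XmlSerializer for all operations on this contract
public interface IMyServiceEndPoint
{
    [OperationContract]
    MyResult GetResult(); // hypothetical operation
}
```

Note that switching serializers changes the wire format expectations, so the service side must use (or at least be compatible with) the same serializer for these operations.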