Why does creating an XmlSerializer instance before calling ServiceHost.Open cause a memory and handle leak?

While tracking down a memory and handle leak in a .NET / WCF / Windows service, I noticed strange behavior that I cannot explain. Below are the setup and the resolution. What I'm looking for is an explanation of the observed behavior.

I installed the Windows service.
I started the service.
I called a simple WCF method inside a transaction (a new channel for each call, no channel caching).
After each call, about two handles remained in memory.

This can be observed if the following points apply:

  • Run it as a Windows service; the behavior does not show up when the host runs as a console application.
  • Call the WCF method inside a transaction (a single process or machine is enough).
  • Before calling ServiceBase.Run(servicesToRun);, create an instance of XmlSerializer for some type.
  • The type is a custom type. The problem does not occur with new XmlSerializer(typeof(string)) or new XmlSerializer(typeof(XmlDocument)). Nothing needs to be serialized; it is enough that the custom type has just a string property (no attributes anywhere!).
  • Generating a static serialization assembly (e.g. with SGen.exe) does not cause this problem.
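
To make the last two bullets concrete, here is a minimal custom type of the kind described: a plain class with a single string property and no serialization attributes. The name MyType matches the fix code below; the property name is a made-up example.

```csharp
// Hypothetical minimal type that is enough to trigger the behavior:
// just one string property, no [XmlRoot]/[XmlElement] attributes anywhere.
public class MyType
{
    public string Name { get; set; }
}
```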

The fix is already in my code:
create the XmlSerializer first thing in OnStart():

Program.cs

    WindowsService winSvc = new WindowsService();
    ServiceBase[] servicesToRun = new ServiceBase[] { winSvc };
    ServiceBase.Run(servicesToRun);

WindowsService.cs

    internal sealed class WindowsService : ServiceBase
    {
        private ServiceHost wcfServiceHost = null;

        internal WindowsService()
        {
            AutoLog = true;
            CanStop = true;
            CanShutdown = true;
            CanPauseAndContinue = false;
        }

        internal void StartWcfService()
        {
            wcfServiceHost = new ServiceHost(typeof(DemoService));
            wcfServiceHost.Open();
        }

        protected override void Dispose(bool disposing)
        {
            if (wcfServiceHost != null)
            {
                wcfServiceHost.Close();
            }
            base.Dispose(disposing);
        }

        protected override void OnStart(string[] args)
        {
            new XmlSerializer(typeof(MyType));
            StartWcfService();
        }
    }

DemoService.cs

    [ServiceBehavior(
        InstanceContextMode = InstanceContextMode.PerSession,
        TransactionAutoCompleteOnSessionClose = false,
        IncludeExceptionDetailInFaults = true)]
    public sealed class DemoService : IDemoService
    {
        [TransactionFlow(TransactionFlowOption.Allowed)]
        [OperationBehavior(TransactionScopeRequired = true, TransactionAutoComplete = true)]
        public int Add(int a, int b)
        {
            return a + b;
        }
    }

Client.cs

    IChannelFactory<IDemoService> channelFactory =
        new ChannelFactory<IDemoService>("defaultClientConfiguration");

    for (int index = 0; index < 5000; index++)
    {
        using (IDisposable channel = (IDisposable)channelFactory.CreateChannel(
            new EndpointAddress("net.tcp://localhost:23456/DemoService")))
        {
            IDemoService demoService = (IDemoService)channel;
            using (TransactionScope tx = new TransactionScope(TransactionScopeOption.RequiresNew))
            {
                demoService.Add(3, 9);
                tx.Complete();
            }
        }
    }

Can anyone explain this behavior?

Please note: I'm not interested in a way to avoid the leak (I already know how to do that), but in an explanation (i.e. WHY this happens).

2 answers

I think a look at the internals makes this a fair question. I'm answering from the back of my head, as I ran into this problem a while ago and it took me a day to track down, including extensive use of Reflector and the ANTS memory profiler (at my previous company)... here goes:

Internally, the XML serializer creates a class (let's call it "A") using System.Reflection.Emit, tailored to the type you pass in. Building such a class is comparatively expensive, but the result can be reused because types do not change. Because of this, the constructed serializer types are cached, so internally it ends up in some kind of dictionary.

For known (basic) types the serializer code is fixed: serializing a string, for example, will not change no matter how often you restart the application. Note the difference from "A", which stands for any type the serializer factory does not know about until it is first passed to XmlSerializer.

The first time a type is used by XmlSerializer, this process runs both for the type you pass in and for all types it needs (e.g. for all fields and properties that require serialization).
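
A short sketch of what that generation step covers (the Person type here is a made-up example): the first new XmlSerializer(typeof(Person)) emits serialization code for Person and every member it reaches, and the simple typeof-based constructor caches the generated assembly so later constructions reuse it.

```csharp
using System;
using System.IO;
using System.Xml.Serialization;

public class Person
{
    public string Name { get; set; }
    public int Age { get; set; }
}

public static class Demo
{
    public static Person RoundTrip(Person input)
    {
        // First construction: Reflection.Emit generates a serialization
        // assembly covering Person and its members (Name, Age). The simple
        // typeof-based constructor caches that assembly internally.
        XmlSerializer serializer = new XmlSerializer(typeof(Person));

        using (var writer = new StringWriter())
        {
            serializer.Serialize(writer, input);
            string xml = writer.ToString();

            // Second construction reuses the cached generated assembly;
            // no new dynamic assembly is emitted.
            XmlSerializer again = new XmlSerializer(typeof(Person));
            using (var reader = new StringReader(xml))
            {
                return (Person)again.Deserialize(reader);
            }
        }
    }
}
```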

About the leak... When you use the ChannelFactory, it builds a serializer if one does not already exist. To do this it checks whether a serializer for the type exists in the dictionary and, if not, creates an instance of some internal serializer type.

For some stupid reason, there is a bug in the factory: it builds the new serializer without saving it in the dictionary. Each time one is created you get a new dynamically emitted type, which shows up as a leak (remember: types can never be unloaded), even if the objects themselves are correctly disposed. When you use the XmlSerializer first yourself, or generate the static serialization assembly, the dictionary cache is used correctly, which means it will not leak. So there you have it: it's a bug. I used to have access to ANTS Memory Profiler, which showed this quite well.
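
To make the caching idea concrete, here is a sketch of the kind of type-keyed cache described above. The names are mine, not actual WCF internals; the reported bug amounts to skipping the "store it back" step, so every lookup emitted a fresh dynamic type.

```csharp
using System;
using System.Collections.Concurrent;
using System.Xml.Serialization;

// Illustrative cache, not the real WCF internal type: serializers are
// looked up by Type and only built (an expensive Reflection.Emit step)
// when no entry exists yet.
public static class SerializerCache
{
    private static readonly ConcurrentDictionary<Type, XmlSerializer> cache =
        new ConcurrentDictionary<Type, XmlSerializer>();

    public static XmlSerializer Get(Type type)
    {
        // GetOrAdd stores the newly built serializer, so the generated
        // dynamic assembly is created at most once per type.
        return cache.GetOrAdd(type, t => new XmlSerializer(t));
    }
}
```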

Hope this explains.


The XmlSerializer documentation says the following:

To increase performance, the XML serialization infrastructure dynamically generates assemblies to serialize and deserialize specified types. The infrastructure finds and reuses those assemblies. This behavior occurs only when using the following constructors:

XmlSerializer.XmlSerializer(Type)

XmlSerializer.XmlSerializer(Type, String)

If you use any of the other constructors, multiple versions of the same assembly are generated and never unloaded, which results in a memory leak and poor performance. The easiest solution is to use one of the previously mentioned two constructors. Otherwise, you must cache the assemblies in a Hashtable, as shown in the following example.
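
A sketch of the caching the documentation recommends when one of the other constructors is needed, here using the XmlSerializer(Type, XmlRootAttribute) overload. The cache key format and class name are my own choices, not from the documentation.

```csharp
using System;
using System.Collections;
using System.Xml.Serialization;

// Cache for serializers built with a non-cached constructor overload.
// Without this, every new XmlSerializer(type, root) emits another
// dynamic assembly that is never unloaded.
public static class RootSerializerCache
{
    private static readonly Hashtable cache = Hashtable.Synchronized(new Hashtable());

    public static XmlSerializer Get(Type type, string rootName)
    {
        // Key on both the type and the root element name, since either
        // changes the generated serialization code.
        string key = type.FullName + ":" + rootName;
        XmlSerializer serializer = (XmlSerializer)cache[key];
        if (serializer == null)
        {
            serializer = new XmlSerializer(type, new XmlRootAttribute(rootName));
            cache[key] = serializer;
        }
        return serializer;
    }
}
```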

http://msdn.microsoft.com/en-us/library/system.xml.serialization.xmlserializer.aspx

