25% increase in WCF CPU usage per client

As the title says, I have a WCF server with this service behavior defined:

    [ServiceBehavior(InstanceContextMode = InstanceContextMode.Single,
                     ConcurrencyMode = ConcurrencyMode.Multiple)]
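
For context, a singleton service carrying that behavior would be declared roughly like this (the DatabaseService class name is taken from the endpoint address below, and the Ping operation is just a placeholder, not my real contract):

    using System.ServiceModel;

    [ServiceContract]
    public interface IDatabaseSession
    {
        // Placeholder operation; the real contract has different members.
        [OperationContract]
        string Ping();
    }

    [ServiceBehavior(InstanceContextMode = InstanceContextMode.Single,
                     ConcurrencyMode = ConcurrencyMode.Multiple)]
    public class DatabaseService : IDatabaseSession
    {
        // A single instance serves every client, and calls may run concurrently,
        // so any shared state has to be synchronized inside the service.
        public string Ping()
        {
            return "pong";
        }
    }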

I use a named pipe binding, and my clients connect like this:

    NetNamedPipeBinding binding = new NetNamedPipeBinding();
    const int maxValue = 0x40000000; // 1 GB

    binding.MaxBufferSize = maxValue;
    binding.MaxReceivedMessageSize = maxValue;

    binding.ReaderQuotas.MaxArrayLength = maxValue;
    binding.ReaderQuotas.MaxBytesPerRead = maxValue;
    binding.ReaderQuotas.MaxStringContentLength = maxValue;

    // The receive timeout acts like a general timeout.
    binding.ReceiveTimeout = TimeSpan.MaxValue;
    binding.SendTimeout = TimeSpan.MaxValue;

    ChannelFactory<IDatabaseSession> pipeFactory = new ChannelFactory<IDatabaseSession>(
        binding, new EndpointAddress("net.pipe://localhost/DatabaseService"));

    IDatabaseSession dbSession = pipeFactory.CreateChannel();

Each client I run executes the code above, and with each client the service's CPU consumption rises by roughly 25% (except for the fifth client, of course, since by then the service executable is already using almost 100% of the total CPU).

What I'm looking for is some kind of resource (a website, a list, or simply your own knowledge) explaining what CreateChannel actually does, particularly with regard to resource allocation.
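
For reference, the only channel teardown I know of is the standard WCF Close/Abort pattern; here is a minimal sketch of it applied to the objects above (generic client lifecycle code, not something I have confirmed affects the CPU usage):

    // Generic WCF client teardown (IClientChannel and CommunicationException
    // come from System.ServiceModel).
    IClientChannel channel = (IClientChannel)dbSession;
    try
    {
        // ... use dbSession here ...
        channel.Close();      // gracefully ends the session over the pipe
        pipeFactory.Close();  // releases the factory and its resources
    }
    catch (CommunicationException)
    {
        channel.Abort();
        pipeFactory.Abort();
    }
    catch (TimeoutException)
    {
        channel.Abort();
        pipeFactory.Abort();
    }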

Note: the CPU usage increases even when no communication actually takes place; merely creating the channel is enough.
