I have a fairly standard socket server in C# that uses the asynchronous Socket methods (BeginAccept(), BeginReceive(), etc.). This server has worked fine for the past 4 years at many client sites running Windows Server 2003. Recently I installed it on a 64-bit Windows Server 2008 R2 machine. Everything looks fine until the first client connects; the accept handler then calls BeginReceive() and BeginAccept(), and from that point CPU utilization jumps to 100% and stays there until I close the listening socket.
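I'm not posting the real setup code, but the listener is started in the standard way, roughly like this (a simplified sketch; the port number and field names are placeholders, not my actual code):

    using System.Net;
    using System.Net.Sockets;

    // Simplified sketch of the listener setup (port and names are placeholders).
    Socket _listener = new Socket(AddressFamily.InterNetwork, SocketType.Stream, ProtocolType.Tcp);
    _listener.Bind(new IPEndPoint(IPAddress.Any, 8000));
    _listener.Listen(100);
    _listener.BeginAccept(AcceptCallback, null);   // AcceptCallback is shown further down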
Not sure if this is important, but the server is running on a virtual machine.
We have run many tests, but nothing has helped. Using Process Explorer, I can see that two new threads are spun up immediately after the calls to BeginReceive()/BeginAccept(), and those are the threads consuming the CPU. Unfortunately, I cannot reproduce the problem on my 64-bit Windows 7 workstation.
I have done a lot of research, and all I have found so far are the following two KB articles, which suggest that Server 2008 R2 may have problems in its TCP/IP components, but they are only available as hotfixes: KB2465772 and KB2477730. I do not want my client to install them until I am sure they will fix the problem.
Has anyone else run into this? If so, what did you have to do to fix it?
Here is the code that I believe triggers the problem (the accept callback, truncated):
    private void AcceptCallback(IAsyncResult result)
    {
        ConnectionInfo connection = new ConnectionInfo();
        try
        {
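            // (The original snippet is cut off above; what follows is a sketch of the
            // standard continuation, not my exact code. Names such as _listener,
            // ConnectionInfo.Socket, ReceiveCallback, and CloseConnection are approximate.)

            // Complete the accept and get the socket for the newly connected client.
            connection.Socket = _listener.EndAccept(result);
            connection.Buffer = new byte[1024];

            // Start an asynchronous receive on the new client socket...
            connection.Socket.BeginReceive(connection.Buffer, 0, connection.Buffer.Length,
                                           SocketFlags.None, ReceiveCallback, connection);

            // ...and immediately post another accept so further clients can connect.
            _listener.BeginAccept(AcceptCallback, null);
        }
        catch (SocketException)
        {
            // On error, clean up the half-open connection.
            CloseConnection(connection);
        }
    }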