I apologize in advance for subjecting you to this long message. I have kept it as short as possible while still preserving the problem.
Okay, this is driving me crazy. I have a client program and a server program, both in C#. The server sends data to the client through Socket.Send(), and the client receives data through Socket.BeginReceive() and Socket.Receive(). My pseudo-protocol works like this: the server sends a two-byte (short) value indicating the length of the actual data, immediately followed by the actual data itself. The client reads the first two bytes asynchronously, converts them to a short, and then immediately reads that many bytes from the socket synchronously.
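To make the framing concrete, here is the idea in a few lines (illustrative only; sock and payload are placeholders, and my real code follows below):

// Sender: a two-byte little-endian length header, then the payload.
sock.Send(BitConverter.GetBytes((short)payload.Length)); // 2 bytes
sock.Send(payload);                                      // payload.Length bytes

// Receiver: read the two-byte header (asynchronously in my code),
// then read the payload synchronously.
byte[] header = new byte[2];
sock.Receive(header);
short n = BitConverter.ToInt16(header, 0);
byte[] body = new byte[n];
sock.Receive(body);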
Now, this works fine at a rate of about one cycle every few seconds, but when I increase the speed, things get weird. The client seems to occasionally read actual data when it is trying to read the two-byte length. It then tries to convert those two arbitrary bytes to a short, which produces a completely wrong value, which leads to a crash. The code below is from my program, trimmed to show only the important lines.
Server method for sending data:
private static object myLock = new object();

private static bool sendData(Socket sock, String prefix, byte[] data) {
    lock (myLock) {
        try {
            // ... rest of method trimmed ...
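Since that is trimmed mid-method, here is roughly what the rest of it does. This is a reconstruction that matches the behavior I describe below, not my verbatim code, and the exact Send calls may differ; it assumes using System, System.Net.Sockets, and System.Text.

private static object myLock = new object();

private static bool sendData(Socket sock, String prefix, byte[] data) {
    lock (myLock) {
        try {
            // Length of the actual data = 4-character ASCII prefix + payload.
            short len = (short)(prefix.Length + data.Length);

            // Two-byte length header first (little-endian on typical hardware)...
            sock.Send(BitConverter.GetBytes(len));

            // ...then the prefix and the payload themselves.
            sock.Send(Encoding.ASCII.GetBytes(prefix));
            sock.Send(data);
            return true;
        } catch (SocketException) {
            return false;
        }
    }
}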
Client method for receiving data:
private static object myLock = new object();

private void receiveData(IAsyncResult result) {
    lock (myLock) {
        byte[] buffer = new byte[1024];
        Socket sock = result.AsyncState as Socket;
        try {
            sock.EndReceive(result);
            // smallBuffer is the two-byte field handed to BeginReceive.
            short n = BitConverter.ToInt16(smallBuffer, 0);
            // ... rest of method trimmed ...
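The trimmed part of the client boils down to something like this. Again, a reconstruction rather than my verbatim code (it assumes using System and System.Net.Sockets); smallBuffer is the two-byte field that BeginReceive fills with the length header.

private static object myLock = new object();
private byte[] smallBuffer = new byte[2]; // holds the two-byte length header

private void receiveData(IAsyncResult result) {
    lock (myLock) {
        byte[] buffer = new byte[1024];
        Socket sock = result.AsyncState as Socket;
        try {
            sock.EndReceive(result);

            // Interpret the two header bytes as the payload length...
            short n = BitConverter.ToInt16(smallBuffer, 0);

            // ...and synchronously read that many bytes of payload.
            sock.Receive(buffer, n, SocketFlags.None);

            // (message processing trimmed)

            // Re-arm the asynchronous read for the next two-byte header.
            sock.BeginReceive(smallBuffer, 0, 2, SocketFlags.None,
                              new AsyncCallback(receiveData), sock);
        } catch (SocketException) {
            // connection closed; trimmed
        }
    }
}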
Server-side code to reliably recreate the problem:
byte[] b = new byte[1020]; // arbitrary length
for (int i = 0; i < b.Length; i++)
    b[i] = 7; // arbitrary value of 7

while (true) {
    // socket is a Socket connected to a client running the code above;
    // "PRFX" is an arbitrary 4-character prefix string.
    sendData(socket, "PRFX", b);
}
Looking at the code above, you can see that the server will forever send the number 1024 (0x0400), the length of the total data including the prefix, as a short, then "PRFX" as ASCII bytes, followed by a bunch of 7s (0x07). The client should forever read the first two bytes, interpret them as 1024, store that value as n, and then read 1024 bytes from the stream.
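So each message on the wire should look like this:

00 04            <- length header: 1024 (0x0400) as a little-endian short
50 52 46 58      <- "PRFX" in ASCII
07 07 ... 07     <- 1020 bytes of value 7
                    (2 + 4 + 1020 = 1026 bytes per message)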
And that is exactly what it does for the first 40 or so iterations. But then, spontaneously, the client reads the first two bytes and interprets them as 1799, not 1024! 1799 in hex is 0x0707, which is two consecutive 7s!!! That's data, not the length! What happened to those two bytes? This happens with whatever value I put in the byte array; I just chose 7 because it makes the correlation with 1799 easy to see.
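You can verify that interpretation in isolation (just the conversion, no sockets involved):

short v = BitConverter.ToInt16(new byte[] { 0x07, 0x07 }, 0);
Console.WriteLine(v); // prints 1799 (0x0707)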
If you are still reading at this point, I applaud your dedication.
Some important observations:
- Reducing the length of b increases the number of iterations before the problem occurs, but does not prevent it.
- Adding a significant delay between iterations of the loop can prevent the problem from occurring.
- The problem does NOT occur when running both the client and the server on the same host and connecting through the loopback address.
As I said, this is driving me crazy! I can usually solve my own programming problems, but this one has me completely stumped. So I'm pleading here for any advice or insight on the matter.
Thanks.