Most efficient way to read a serialized BinaryFormatter object from NetworkStream?

I have an application that sends serializable objects of different sizes over a socket connection, and I would like it to be as scalable as possible. There may also be tens to hundreds of concurrent connections.

  • NetworkStream comes from TcpClient, which constantly listens for incoming messages.
  • I do not want to block on the standard NetworkStream.Read(); this needs to scale. I assume Read() is blocking, since that is pretty standard behavior for this kind of class, and the ReadTimeout property on the class suggests as much.
  • I'm not sure whether BinaryFormatter just calls Read(), or whether it does something asynchronous for me under the hood. My guess is that it doesn't.
  • The TcpClient needs to receive a message, read it to the end, and then go back to listening for messages (see the sketch below).
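
For context, the listening side looks roughly like this (a simplified sketch with made-up names like ListenerSketch and HandleClientAsync, not my actual code): accept clients asynchronously and hand each connection off to its own handler, so the accept loop itself never blocks.

    using System.IO;
    using System.Net;
    using System.Net.Sockets;
    using System.Runtime.Serialization.Formatters.Binary;
    using System.Threading.Tasks;

    static class ListenerSketch
    {
        public static async Task ListenAsync(int port)
        {
            var listener = new TcpListener(IPAddress.Any, port);
            listener.Start();
            while (true)
            {
                TcpClient client = await listener.AcceptTcpClientAsync();
                var handlerTask = HandleClientAsync(client); // one handler per connection, not awaited here
            }
        }

        static async Task HandleClientAsync(TcpClient client)
        {
            using (client)
            using (var netStream = client.GetStream())
            using (var ms = new MemoryStream())
            {
                // "How to read one message efficiently?" is exactly what the
                // options below are about; CopyToAsync here is just a placeholder.
                await netStream.CopyToAsync(ms);
                ms.Position = 0;
                var obj = new BinaryFormatter().Deserialize(ms);
                // hand obj off for further processing here
            }
        }
    }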

So it seems there is more than one way to skin this cat, and I'm not sure which will actually be the most efficient. I could:

Just use BinaryFormatter to read directly from the NetworkStream?

    var netStream = client.GetStream();
    var formatter = new BinaryFormatter();
    var obj = formatter.Deserialize(netStream);

OR do some magic with the new async/await stuff:

    using (var ms = new MemoryStream())
    {
        var netStream = client.GetStream();
        var buffer = new byte[1028];
        int bytesRead;
        while ((bytesRead = await netStream.ReadAsync(buffer, 0, buffer.Length)) > 0)
        {
            ms.Write(buffer, 0, bytesRead); // only write the bytes actually read
        }
        ms.Position = 0; // rewind before deserializing
        var formatter = new BinaryFormatter();
        var obj = formatter.Deserialize(ms);
    }

OR the same as the previous example, but using the new CopyToAsync method:

    using (var ms = new MemoryStream())
    {
        var netStream = client.GetStream();
        await netStream.CopyToAsync(ms); // 4096 default buffer
        ms.Position = 0; // rewind before deserializing
        var formatter = new BinaryFormatter();
        var obj = formatter.Deserialize(ms);
    }

OR something else?

I am looking for an answer that provides maximum scalability / efficiency.

[Note: the above is all pseudo code, given as examples]

+6
4 answers

The first approach has a problem with large streams: if you ever send big data, this code will blow up the application with an out-of-memory exception.

The second approach looks good: it is asynchronous (meaning you are not tying up valuable threads just waiting for a read to complete), and it works with the data in chunks (which is how you should work with a stream).

So go with the second option, perhaps with a slight modification: deserialize only one chunk of data at a time instead of reading everything first (useful if you are not sure about the length of the stream).

This is what I mean (pseudo code)

    using (var networkStream = client.GetStream()) // get access to the stream
    {
        var buffer = new byte[1234]; // get a buffer
        int bytesRead;
        // still has some data?
        while ((bytesRead = await networkStream.ReadAsync(buffer, 0, buffer.Length)) > 0)
        {
            // om nom nom buffer
            Foo obj;
            using (var ms = new MemoryStream()) // process just one chunk
            {
                ms.Write(buffer, 0, bytesRead);
                ms.Position = 0;
                var formatter = new BinaryFormatter();
                obj = (Foo)formatter.Deserialize(ms); // deserialise the object
            } // dispose memory
            // async send obj up for further processing
        }
    }
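
One caveat about the chunked pseudo code above: it only works if each serialized object fits inside a single read. If objects can be bigger than one buffer, a common approach (a sketch of my own, with hypothetical helpers ReadMessageAsync / ReadExactlyAsync, and it assumes the sender writes a 4-byte length prefix before each serialized object) is to frame each message with its length and read exactly that many bytes before deserializing:

    using System;
    using System.IO;
    using System.Net.Sockets;
    using System.Runtime.Serialization.Formatters.Binary;
    using System.Threading.Tasks;

    static class LengthPrefixedReader // hypothetical helper, not part of the question's code
    {
        public static async Task<object> ReadMessageAsync(NetworkStream netStream)
        {
            // 4-byte length prefix, written by the (assumed) sender
            var lengthBytes = await ReadExactlyAsync(netStream, 4);
            int length = BitConverter.ToInt32(lengthBytes, 0);

            var payload = await ReadExactlyAsync(netStream, length);
            using (var ms = new MemoryStream(payload))
                return new BinaryFormatter().Deserialize(ms);
        }

        // Read exactly 'count' bytes; a single ReadAsync may return fewer.
        static async Task<byte[]> ReadExactlyAsync(NetworkStream netStream, int count)
        {
            var buffer = new byte[count];
            int offset = 0;
            while (offset < count)
            {
                int read = await netStream.ReadAsync(buffer, offset, count - offset);
                if (read == 0)
                    throw new EndOfStreamException("Connection closed mid-message.");
                offset += read;
            }
            return buffer;
        }
    }

The sending side would do the mirror image: serialize the object to a MemoryStream, write the 4-byte length, then write the payload.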
+4

The async/await stuff will let you avoid blocking threads while waiting on resources, so in general it will scale better than the thread-blocking versions.

+2

Async will scale better if there are hundreds of simultaneous operations.

However, it will be slower. Async has overhead that is easy to detect in tests. Prefer option 1 unless you actually need option 2.
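
A quick way to see that overhead yourself is a rough micro-benchmark along these lines (my own throwaway sketch, not rigorous): time many small reads from an in-memory stream, synchronously and with await. On a MemoryStream every ReadAsync completes synchronously, so most of the measured difference is the async machinery itself rather than I/O.

    using System;
    using System.Diagnostics;
    using System.IO;
    using System.Threading.Tasks;

    static class AsyncOverheadSketch // throwaway benchmark, not production code
    {
        public static void Main()
        {
            RunAsync().GetAwaiter().GetResult();
        }

        static async Task RunAsync()
        {
            var data = new byte[64 * 1024 * 1024]; // 64 MB to read through in 1 KB chunks
            var buffer = new byte[1024];

            var sw = Stopwatch.StartNew();
            using (var ms = new MemoryStream(data))
                while (ms.Read(buffer, 0, buffer.Length) > 0) { }
            Console.WriteLine("sync:  " + sw.ElapsedMilliseconds + " ms");

            sw.Restart();
            using (var ms = new MemoryStream(data))
                while (await ms.ReadAsync(buffer, 0, buffer.Length) > 0) { }
            Console.WriteLine("async: " + sw.ElapsedMilliseconds + " ms");
        }
    }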
+2

I thought it was also worth mentioning that there is a difference between going async vs. sync from the client's point of view. If you go asynchronous, everyone tends to get similar response times; so if all your requests are heavy, everyone will see a slower response. With synchronous requests, users with light requests are served much faster, since they are not held up by other users. However, if you have many simultaneous requests in a synchronous setup, it is ultimately possible that all your threads will be blocked and requests will get no response at all.

+1
