I am writing an application that involves writing fairly large chunks of data to an OutputStream (belonging to a Socket). The thing is, it's a bit complicated, because there are usually several threads trying to write to the same OutputStream. The way I've currently designed it, the OutputStream that data is written to has its own dedicated thread. That thread holds a queue (a LinkedList) of byte arrays and writes them out as soon as it can.
private class OutputStreamWriter implements Runnable {

    // Chunks waiting to be written to the socket's OutputStream.
    private final LinkedList<byte[]> chunkQueue = new LinkedList<byte[]>();

    public void run() {
        OutputStream outputStream = User.this.outputStream;
        while (true) {
            try {
                if (chunkQueue.isEmpty()) {
                    // Nothing to write yet; back off briefly and re-check.
                    Thread.sleep(100);
                    continue;
                }
                // Take the oldest chunk and write it to the socket.
                outputStream.write(chunkQueue.poll());
            } catch (Exception e) {
                e.printStackTrace();
            }
        }
    }
}
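For context, the producing side is essentially just an add to that queue. A minimal sketch (queueChunk and writer are placeholder names, not my exact code):

// Illustrative sketch of the producing side -- several threads call this
// concurrently to hand a chunk to the writer thread above.
public void queueChunk(byte[] chunk) {
    writer.chunkQueue.add(chunk);   // writer is the OutputStreamWriter instance
}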
The problem with this design is that as more and more writes happen, more and more data piles up in the queue, and it doesn't get written out any faster. Initially, when data is queued, it gets written almost immediately. Then, after about 15 seconds, the data starts to lag: a delay builds up between the moment the data is queued and the time it is actually written. Over time this delay grows longer and longer. It's very noticeable.
One way to fix this would be some implementation of a ConcurrentOutputStream that allows data to be dispatched without blocking, so that the writes never start backing up (hell, the queue would then be unnecessary). I don't know if such an implementation exists - I couldn't find one - and personally I don't think it's even possible to write.
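To make it concrete, here is a rough sketch of the kind of contract I'm imagining (purely hypothetical - this class does not exist as far as I know, and the names are made up):

import java.io.IOException;
import java.io.OutputStream;

// Hypothetical sketch only -- not an existing class.
// The idea: write() always returns immediately, is safe to call from many
// threads at once, and the bytes still reach the underlying socket stream
// in order, eventually.
public abstract class ConcurrentOutputStream extends OutputStream {

    protected final OutputStream target;   // the socket's OutputStream

    protected ConcurrentOutputStream(OutputStream target) {
        this.target = target;
    }

    // Never blocks the caller, no matter how slow the socket is.
    @Override
    public abstract void write(byte[] chunk) throws IOException;
}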
So, does anyone have any suggestions on how I can redesign it?