I get the feeling that I'm missing something really obvious here.
The general structure of my system makes me want to use a blocking DatagramChannel without Selectors, to keep everything simple. I am trying to handle timeouts by setting a timeout on the socket, but it does not seem to have any effect.
This rough pseudocode shows what I am trying to achieve:
DatagramChannel channel = DatagramChannel.open();
channel.socket().bind(localAddress);     // localAddress is a placeholder
channel.socket().setSoTimeout(3000);     // intended 3-second receive timeout
channel.send(outBuffer, peerAddress);    // send the request
channel.receive(inBuffer);               // I expect this to time out after 3 s
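For comparison, this is roughly what the Selector-based variant I am trying to avoid would look like. It is only a sketch: the peer address, port, payload and buffer size are placeholders, not values from my real system.

import java.net.InetSocketAddress;
import java.nio.ByteBuffer;
import java.nio.channels.DatagramChannel;
import java.nio.channels.SelectionKey;
import java.nio.channels.Selector;

public class UdpSelectorTimeout {
    public static void main(String[] args) throws Exception {
        DatagramChannel channel = DatagramChannel.open();
        channel.socket().bind(new InetSocketAddress(0));  // any free local port
        channel.configureBlocking(false);                 // a Selector requires non-blocking mode

        Selector selector = Selector.open();
        channel.register(selector, SelectionKey.OP_READ);

        // placeholder peer address and payload
        InetSocketAddress peerAddress = new InetSocketAddress("localhost", 9999);
        channel.send(ByteBuffer.wrap("ping".getBytes()), peerAddress);

        ByteBuffer inBuffer = ByteBuffer.allocate(1024);
        if (selector.select(3000) > 0) {         // wait at most 3 seconds for a datagram
            channel.receive(inBuffer);           // something arrived, so this returns immediately
            System.out.println("got " + inBuffer.position() + " bytes");
        } else {
            System.out.println("timed out after 3 seconds");
        }
    }
}

The select(3000) call gives me the bounded wait I want, but it is exactly this extra Selector bookkeeping that I was hoping setSoTimeout would let me skip.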
For testing, I have a UDP server that sends five quick replies and then delays for about five seconds before delivering the sixth reply.
That delay never raises a SocketTimeoutException: channel.receive simply blocks until the sixth reply arrives. Why is this? The timeout set on the socket does not seem to be taken into account when calling channel.receive.
Regards, Fredrick