How to choose the maximum number of threads for an HTTP servlet container?

I am developing a RESTful web service that runs as a servlet (using blocking I/O) in Jetty. Figuring out the optimal setting for the maximum number of threads seems difficult.

Is there a well-researched formula for determining the maximum number of threads from some easily measurable characteristics of the rest of the setup?

+7
java optimization concurrency servlets
6 answers

Very simple and primitive:

max_number_of_threads = number_of_CPUs * C

Where C depends on other factors of your application :-)

Ask yourself the following questions:

  • Will your application be CPU-intensive (lower C) or spend most of its time waiting on a third-party system (higher C)?
  • Do you need faster response times (lower C), or the ability to serve many users at once, even if each request takes longer (higher C)?

I usually set C fairly low, e.g. 2-10.
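As a minimal sketch of this rule of thumb (the value of C here is a hypothetical placeholder you would tune for your own workload):

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

public class PoolSizing {
    public static void main(String[] args) {
        // C is an assumed tuning factor: low (~2) for CPU-bound work,
        // higher (~10) when requests mostly wait on external systems.
        int c = 4; // hypothetical starting point; measure and adjust
        int cpus = Runtime.getRuntime().availableProcessors();
        int maxThreads = cpus * c;
        System.out.println("max threads = " + maxThreads);

        ExecutorService pool = Executors.newFixedThreadPool(maxThreads);
        pool.shutdown();
    }
}
```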

+3

No. Make sure the number of threads is limited and controlled, so that you do not exhaust system resources; the practical limit for live Java threads is usually around 100-200.

A good way to do this is to use the Executors utilities from java.util.concurrent.
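A sketch of a bounded pool along these lines, using `ThreadPoolExecutor` directly so both the thread count and the request queue have hard limits (the specific numbers here are illustrative, not recommendations):

```java
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.ThreadPoolExecutor;
import java.util.concurrent.TimeUnit;

public class BoundedExecutor {
    public static void main(String[] args) throws Exception {
        // At most 100 threads, a bounded queue of 500 waiting requests,
        // and a rejection policy that pushes back on the caller rather
        // than letting work pile up without limit.
        ThreadPoolExecutor pool = new ThreadPoolExecutor(
                10, 100,                      // core and max pool size
                60, TimeUnit.SECONDS,         // idle-thread keep-alive
                new ArrayBlockingQueue<>(500),
                new ThreadPoolExecutor.CallerRunsPolicy());

        pool.submit(() -> System.out.println("request handled"));

        pool.shutdown();
        pool.awaitTermination(5, TimeUnit.SECONDS);
    }
}
```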

+1

I realize that Servlet 3.0 had not been released when this question was asked, but I thought it worth noting here the possibility of asynchronous processing in a Servlet 3.0 container, as it may help someone facing this issue. There are plenty of Servlet 3.0 resources showing that the container's main servlet threads come under much less pressure this way. Jetty also has its own async equivalents if you would rather not use the Servlet 3.0 API itself.
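To illustrate the idea behind that handoff in plain Java (this is not the actual Servlet 3.0 `AsyncContext` API, just a sketch of the principle: a small "container" pool returns quickly by delegating slow work to a separate pool):

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;
import java.util.concurrent.TimeUnit;

public class AsyncHandoff {
    public static void main(String[] args) throws Exception {
        // A small pool standing in for the container's servlet threads.
        ExecutorService containerThreads = Executors.newFixedThreadPool(2);
        // A separate pool for slow back-end calls, playing the role of
        // work continued after startAsync() in Servlet 3.0.
        ExecutorService backendWorkers = Executors.newFixedThreadPool(20);

        Future<?> request = containerThreads.submit(() -> {
            // Instead of blocking the container thread on a slow call,
            // hand the work off and return immediately.
            backendWorkers.submit(() -> {
                try {
                    Thread.sleep(100); // simulate a slow external server
                } catch (InterruptedException ignored) { }
                System.out.println("response completed off the container thread");
            });
        });

        request.get(); // the container thread is already free here
        backendWorkers.shutdown();
        backendWorkers.awaitTermination(5, TimeUnit.SECONDS);
        containerThreads.shutdown();
    }
}
```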

+1

The answer depends on the maximum number of concurrent connections you expect to handle. Allow as many threads as that.

andreasmk2 is incorrect about the number of threads. I have run applications with 1000 threads and had no problems with system resources; of course, it depends on the characteristics of your system. You will hit a system limit, not a Java limit.

0

My problem is that I do not know how to form a reasonable expectation of the number of simultaneous connections. Presumably, at some point it is better to refuse new connections than to let everything slow down because too many requests are being serviced.

Realistic workloads are hard to model, so I'm looking for a formula someone else has already worked out.

(The obvious upper bound is the maximum heap size divided by the minimum amount of memory needed to service a request, but even that is difficult to measure in a garbage-collected environment.)
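That crude upper bound can be computed like this (the per-request footprint is a hypothetical figure you would have to measure; GC overhead makes the real ceiling lower, so treat the result as an order-of-magnitude cap at best):

```java
public class HeapBound {
    public static void main(String[] args) {
        // Maximum heap the JVM will attempt to use (-Xmx in practice).
        long maxHeapBytes = Runtime.getRuntime().maxMemory();
        // Assumed per-request memory footprint: a hypothetical 2 MiB.
        long assumedBytesPerRequest = 2L * 1024 * 1024;

        long upperBound = maxHeapBytes / assumedBytesPerRequest;
        System.out.println("rough concurrency ceiling = " + upperBound);
    }
}
```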

0

Thanks. I read this as meaning there is no easy formula. :-(

(My application is an HTML5 validator, so it sometimes legitimately waits on external servers. Still, it is hard to determine when it is actually CPU-bound, either directly or via the garbage collector.)

0
