5, 100, 1000?
I think "it depends", but on what?
What is common in applications that run as server daemons / services?
What are the hard limits?
Assuming the machine can handle the total workload, how do you determine the number of threads at which the overhead starts to hurt performance?
What are the important differences between operating systems?
What else needs to be considered?
I ask because I would like to use threads to organize subcomponents of my application that do not share data and are designed to run in parallel. Since the application will also use thread pools to parallelize some tasks, I was wondering at what point I should start worrying about the total number of threads being launched. A sketch of what I mean follows below.
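To make this concrete, here is a small sketch of "one long-lived thread per independent subsystem" (Java is assumed purely for illustration; the subsystem names "network-listener" and "log-flusher" are made up):

```java
import java.util.concurrent.TimeUnit;

public class SubsystemThreads {
    public static void main(String[] args) throws InterruptedException {
        // One long-lived, named thread per independent subsystem,
        // used much the way separate processes might otherwise be.
        Runnable heartbeat = () -> {
            while (!Thread.currentThread().isInterrupted()) {
                System.out.println(Thread.currentThread().getName() + " alive");
                try {
                    TimeUnit.SECONDS.sleep(1);
                } catch (InterruptedException e) {
                    Thread.currentThread().interrupt(); // restore flag, exit loop
                }
            }
        };
        Thread listener = new Thread(heartbeat, "network-listener");
        Thread flusher  = new Thread(heartbeat, "log-flusher");
        listener.start();
        flusher.start();

        TimeUnit.SECONDS.sleep(3); // let the subsystems run briefly
        listener.interrupt();      // ask each subsystem to shut down
        flusher.interrupt();
        listener.join();
        flusher.join();
    }
}
```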
I know the "n + 1" rule (one thread per CPU core, plus one) as a guideline for choosing how many threads should work on the same task in parallel for better performance. However, I also want to use threads the way one might otherwise use processes at a larger scale, i.e. to organize independent tasks that should not interfere with each other.
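For example, applying the "n + 1" rule to a thread pool might look like this (again Java, just as an illustration, using the standard java.util.concurrent classes):

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;

public class NPlusOnePool {
    public static void main(String[] args) throws InterruptedException {
        // "n + 1": one worker per core, plus one spare so the CPU stays busy
        // while a worker is briefly blocked (I/O, page fault, ...).
        int workers = Runtime.getRuntime().availableProcessors() + 1;
        ExecutorService pool = Executors.newFixedThreadPool(workers);

        for (int i = 0; i < 100; i++) {
            final int task = i;
            pool.submit(() -> System.out.println(
                    "task " + task + " on " + Thread.currentThread().getName()));
        }

        pool.shutdown();                            // no new tasks accepted
        pool.awaitTermination(1, TimeUnit.MINUTES); // wait for queued work
    }
}
```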
In this related question, some answers advise minimizing the number of threads because of the added complexity. It seems to me that threads can also help keep things organized and actually reduce clutter. Isn't that so?