I have a Windows C# service acting as a server. The service holds some large (>8 GB) in-memory data structures and exposes remote search methods to clients.
An average search operation completes in under 200 ms, and the service handles up to 20 requests/sec.
I often notice severe performance degradation (>6000 ms per request) lasting several seconds.
My best guess is that the server threads are occasionally stopped by a Gen 2 garbage collection.
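Before changing any GC settings, one way to test this guess is to compare `GC.CollectionCount(2)` before and after each request and log the cases where a Gen 2 collection overlapped a slow request. The sketch below is illustrative; the threshold and the logging are placeholders, and the action passed in would be the actual search call:

```csharp
using System;
using System.Diagnostics;

static class GcDiagnostics
{
    // Runs an action and reports whether a Gen 2 collection
    // occurred while it was executing.
    public static void MeasureWithGen2Check(Action action)
    {
        int gen2Before = GC.CollectionCount(2);
        var sw = Stopwatch.StartNew();

        action();

        sw.Stop();
        int gen2During = GC.CollectionCount(2) - gen2Before;
        if (gen2During > 0 || sw.ElapsedMilliseconds > 1000)
        {
            Console.WriteLine(
                $"Slow request: {sw.ElapsedMilliseconds} ms, " +
                $"Gen 2 collections during request: {gen2During}");
        }
    }
}
```

If the slow requests consistently coincide with an increase in the Gen 2 count, the blocking-GC theory is confirmed; if not, the cause is likely elsewhere (lock contention, paging of the 8 GB structures, etc.).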
I am considering switching from server GC to workstation GC and wrapping the search method in a low-latency block to prevent GC during requests:
    protected static void DoLowLatencyAction(Action action)
    {
        GCLatencyMode oldMode = GCSettings.LatencyMode;
        try
        {
            GCSettings.LatencyMode = GCLatencyMode.LowLatency;
            action();
        }
        finally
        {
            // Restore the previous latency mode even if the action throws.
            GCSettings.LatencyMode = oldMode;
        }
    }
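For context, the intended usage would look roughly like this, assuming the `DoLowLatencyAction` helper above; `SearchIndex`, `Query`, and `SearchResult` are hypothetical stand-ins for the actual search API:

```csharp
public SearchResult HandleRequest(Query query)
{
    SearchResult result = null;
    // Run the search with GCLatencyMode.LowLatency in effect.
    DoLowLatencyAction(() => result = SearchIndex(query));
    return result;
}
```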
Is that a good idea?
Under what conditions will the GC be executed anyway inside the low latency block?
Note: I am running an x64 server with 8 cores.
Thanks.