How does garbage collection work with several running processes / threads?

I have a Python server program that accepts packets and processes them according to their type. To do this, I create several processes with the multiprocessing module. I noticed that garbage collection introduces delays, and packets are not processed within the required time interval. I know how to disable garbage collection:

    import gc
    gc.disable()

However, my question is: how exactly does Python handle garbage collection when multiple processes or threads are involved? Is there any difference between how garbage collection works for processes and for threads? Do I need to change the garbage collection settings for each process/thread, or does a single change in the parent process/thread also take effect for all child processes/threads?

I am using Python 2.7, but I would be interested to know whether the behaviour is the same in Python 2 and Python 3.

1 answer

A process can have multiple threads. Garbage collection operates per process: all threads within a process share the same collector and the same settings.
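A minimal sketch of this (not part of the original answer): the gc module state belongs to the interpreter, so every thread in the process sees the same setting.

    from __future__ import print_function  # so the example runs on Python 2.7 and 3
    import gc
    import threading

    def worker():
        # Every thread observes the process-wide GC state.
        print("GC enabled in thread:", gc.isenabled())

    gc.disable()          # affects the whole process, hence every thread in it
    t = threading.Thread(target=worker)
    t.start()
    t.join()              # prints: GC enabled in thread: False
    gc.enable()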

On systems that support fork: if you disable garbage collection in one process and then fork it (i.e. create a copy of the process), the GC will also be disabled in the copy.
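A minimal sketch, assuming a Unix-like system where os.fork() is available: the child starts as a copy of the parent and therefore inherits the parent's GC setting.

    from __future__ import print_function  # so the example runs on Python 2.7 and 3
    import gc
    import os

    gc.disable()
    pid = os.fork()
    if pid == 0:
        # Child process: GC is still disabled because the state was copied from the parent.
        print("child GC enabled:", gc.isenabled())   # False
        os._exit(0)
    else:
        os.waitpid(pid, 0)
        print("parent GC enabled:", gc.isenabled())  # False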

If new processes are created that are not copies (i.e. not forked), they have their own garbage collection configuration, and by default their GC is enabled.

There are many libraries that provide a Process class, and I cannot say what each of them does. If you use the os module to start a new process (not a copy), it should start with GC enabled.
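If in doubt, a simple approach (an assumption on my part, not something from the original answer) is to disable GC explicitly at the start of each worker, so the setting does not depend on how the child process was created. The handle_packets function and the None sentinel below are made up for illustration.

    import gc
    import multiprocessing

    def handle_packets(queue):
        gc.disable()                      # per-process setting; must run inside the child
        while True:
            packet = queue.get()
            if packet is None:            # sentinel: shut the worker down
                break
            # ... process the packet ...

    if __name__ == "__main__":
        queue = multiprocessing.Queue()
        workers = [multiprocessing.Process(target=handle_packets, args=(queue,))
                   for _ in range(4)]
        for w in workers:
            w.start()
        # ... put packets on the queue ...
        for _ in workers:
            queue.put(None)
        for w in workers:
            w.join()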

The GC configuration of one process does not affect that of another. Processes are isolation boundaries that protect code from one another, so state in one process cannot easily leak into another.
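A minimal sketch illustrating this isolation (again an assumption, not from the original answer): changing the GC setting in the parent after a child has started does not reach into the already running child.

    from __future__ import print_function  # so the example runs on Python 2.7 and 3
    import gc
    import multiprocessing
    import time

    def report():
        time.sleep(0.5)                    # give the parent time to re-enable GC
        print("child GC enabled:", gc.isenabled())

    if __name__ == "__main__":
        gc.disable()
        p = multiprocessing.Process(target=report)
        p.start()
        gc.enable()                        # only affects the parent process
        p.join()                           # with a fork start (Unix default) the child
                                           # still prints False; a freshly spawned child
                                           # would print True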

