Python multiprocessing Pool processes persist after the work is done

I have a Python program that takes about 10 minutes to complete, so I use Pool from multiprocessing to speed up the work:

from multiprocessing import Pool
p = Pool(processes=6)                      # I have an 8-thread processor
results = p.map(function, argument_list)   # distributes the work over 6 processes!

It runs much faster, just from that. God bless Python! And that, I thought, was that.

However, I noticed that every time I do this, the processes (and their now much larger state) stick around, even after p goes out of scope; effectively, I have created a memory leak. The processes show up in my System Monitor application as Python processes that use no CPU but keep a significant amount of memory to hold their state.

The pool has the methods close, terminate and join, and I assume one of them will kill the processes. Does anyone know which is the best way to tell my pool p that I am done with it?

Many thanks for your help!

1 answer

From the Python docs, it looks like you need to call:

p.close()
p.join()

after map() to indicate that the workers should exit, and then wait for them to do so.
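
For completeness, here is a minimal sketch of the whole pattern; function and argument_list below are stand-ins for the real worker and inputs from the question, not part of the original code:

from multiprocessing import Pool

def function(x):
    # stand-in for the real 10-minute workload
    return x * x

if __name__ == '__main__':
    argument_list = range(100)
    p = Pool(processes=6)
    try:
        results = p.map(function, argument_list)
    finally:
        p.close()   # no further tasks will be submitted to the pool
        p.join()    # block until every worker process has exited

On Python 3.3 and later the pool can also be used as a context manager (with Pool(processes=6) as p:); note that exiting the with block calls terminate() rather than close()/join(), which is usually fine once map() has already returned all the results.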

