I have a Python program that takes about 10 minutes to complete, so I use Pool from multiprocessing to speed up the work:
from multiprocessing import Pool
p = Pool(processes=6)                        # spawn 6 worker processes
results = p.map(function, argument_list)     # run function over argument_list in parallel
It works much faster, just from that. God bless Python! And so I thought that would be that.
However, I noticed that every time I do this, the processes and their considerably sized state remain around, even when p has gone out of scope; effectively, I have created a memory leak. The processes show up in my System Monitor application as Python processes that use no CPU at this point but hold considerable memory to maintain their state.
The pool has the functions close, terminate, and join, and I assume one of them will kill the processes. Does anyone know the best way to tell my pool p that I am finished with it?
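For reference, here is roughly what I am trying at the moment. It is only a minimal sketch of my guess at the cleanup pattern (close followed by join), and square plus range(10) are just stand-ins for my real worker function and argument list:

from multiprocessing import Pool

def square(x):
    # stand-in for my real worker function
    return x * x

if __name__ == "__main__":
    p = Pool(processes=6)
    results = p.map(square, range(10))
    p.close()   # stop accepting new work
    p.join()    # wait for the worker processes to exit and be reaped
    # In Python 3 a with-block also cleans up, though exiting it calls terminate():
    # with Pool(processes=6) as p:
    #     results = p.map(square, range(10))

Is that the intended way to release the workers and their memory, or should I be calling terminate instead?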
Many thanks for your help!