Python process growing memory over time

My Python process's memory usage keeps growing as it stores dynamic data in lists, dictionaries, and tuples as needed. Even after all of this dynamic data is cleared from its variables, the memory usage does not go back down.

This made me suspect a memory leak, so I called gc.collect() to reclaim all the free memory. But even when the variables held no data, I could not get the memory usage back to a minimum.
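As a point of reference, here is a minimal sketch of the pattern described (the sizes are arbitrary). Python frees the objects internally, and gc.collect() reports how many unreachable objects it found, but the resident memory of the process as seen by the OS may not shrink:

```python
import gc

# build some sizeable dynamic data, then drop every reference to it
data = [list(range(1_000)) for _ in range(1_000)]
del data

# gc.collect() returns the number of unreachable objects it found;
# the objects are freed inside the interpreter, yet the OS-level
# process size can stay where it was
unreachable = gc.collect()
print(unreachable)
```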

+7
python memory-management
Jun 22 '18
2 answers

In general it is very difficult for a process to "give memory back to the OS" (until the process terminates and the OS reclaims all of its memory, of course), because most malloc implementations carve the chunks they hand out from large blocks for efficiency, and a whole block cannot be returned while any part of it is still in use; that is why most C standard libraries do not even try.
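You can watch this from inside the interpreter with sys.getallocatedblocks(): the interpreter's own count of allocated blocks drops after a del, even when the OS-reported process size does not (a sketch, with arbitrary sizes):

```python
import gc
import sys

before = sys.getallocatedblocks()

# allocate many small objects; the interpreter carves them
# out of larger malloc'd arenas
data = [list(range(100)) for _ in range(10_000)]
mid = sys.getallocatedblocks()

del data
gc.collect()
after = sys.getallocatedblocks()

# the interpreter has freed the blocks even if the underlying
# arenas have not been handed back to the OS
print(mid > before, after < mid)
```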

For a good discussion in a Python context, see, for example, here. Evan Jones fixed some Python-specific problems, as described here and here, but his patch has been in the trunk since Python 2.5, so the problems you observe are definitely related to the system malloc package, not to Python as such. Notes for Python 2.6 are here and here.

There is an SO thread here in which Hugh Allen quotes Firefox programmers in his answer to the effect that Mac OS X is a system where it is essentially impossible for a process to return memory to the OS.

So terminating the process is the only reliable way to free its memory. For example, a long-running server can from time to time save its state to disk and shut down (with a tiny watchdog process, system-supplied or custom, watching it and restarting it). If you know that an upcoming operation will need a lot of memory for a short time, you can often os.fork, do the memory-hungry work in the child process, and have the results (if any) passed back to the parent process when the child terminates. And so on.

+6
Jun 23 '18

How much memory are we talking about? Python itself takes up some memory, perhaps 30 or 40 MB. If it is bigger than that and not going down, you have a memory leak. Garbage collection can only reclaim unreferenced objects; somehow your excess data is still referenced. Profile the memory and see what is holding on to it.
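To profile the memory, the stdlib tracemalloc module is one option (the `cache` name and sizes here are made up for illustration): it shows which source lines are holding on to the most memory, which is exactly what you need to find the lingering references.

```python
import tracemalloc

tracemalloc.start()

# a stand-in for the "still referenced" data: as long as `cache`
# is reachable, the garbage collector cannot reclaim these bytes
cache = [bytes(1_000) for _ in range(100)]

current, peak = tracemalloc.get_traced_memory()
snapshot = tracemalloc.take_snapshot()

# the top statistic points at the source line doing the allocating
top_stats = snapshot.statistics("lineno")
print(top_stats[0])
tracemalloc.stop()
```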

0
Jun 22 '18


