I have run into this in other languages, and now I have the same problem in Python. I have a dictionary that undergoes many CRUD operations. One might assume that deleting elements from a dictionary would reduce its memory footprint, but this is not true: once a dictionary grows (its allocation usually doubles), it never (?) frees the allocated memory back. I ran this experiment:
```python
import random
import sys
import uuid

a = {}
for i in range(0, 100000):
    a[uuid.uuid4()] = uuid.uuid4()
    if i % 1000 == 0:
        print sys.getsizeof(a)

for i in range(0, 100000):
    e = random.choice(a.keys())
    del a[e]
    if i % 1000 == 0:
        print sys.getsizeof(a)

print len(a)
```
The last size printed by the first loop is 6291736. The last size printed by the second loop is also 6291736, yet at that point `len(a)` is 0.
So how can I solve this problem? Is there a way to force Python to free this memory?
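For context, one workaround often suggested for this situation (not part of the question itself, and specific to CPython) is to rebuild the dictionary after it has shrunk: `dict(d)` copies only the live entries into a freshly sized hash table, so the old over-allocated table can be reclaimed. A minimal Python 3 sketch:

```python
import sys

# Grow a dict to 100,000 entries, then empty it with del.
d = {i: str(i) for i in range(100000)}
grown = sys.getsizeof(d)

for k in list(d):
    del d[k]

# Deletion never shrinks the table in CPython:
emptied = sys.getsizeof(d)   # still the grown size, even though len(d) == 0

# Rebuilding copies the (zero) live entries into a right-sized table,
# releasing the old over-allocation to the allocator.
d = dict(d)
rebuilt = sys.getsizeof(d)   # back down to an empty-dict size

print(grown, emptied, rebuilt)
```

This trades a one-time copy for the reclaimed memory, so it only pays off when the dictionary has shrunk substantially and will stay small.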
PS: the random deletions are not essential; I also played with the range of the second loop.
python garbage-collection memory-management dictionary
Schultz9999