Python memory usage: which of my objects is hogging the most memory?

The program I'm writing stores a large amount of data in dictionaries. Specifically, I create 1588 class instances, each of which contains 15 dictionaries with 1500 float-to-float mappings. This process uses up the 2 GB of memory on my laptop pretty quickly (I start writing to swap at about the 1000th instance of the class).

My question is this: which of the following is using up my memory?

  • The 34 million odd pairs of floats?
  • The overhead of the 22,500 dictionaries?
  • The overhead of the 1,500 classes?

It seems to me that the memory hog should be the huge quantity of floating point numbers I'm holding in memory. However, if what I've read so far is correct, each of my floats takes up 16 bytes. Since I have 34 million pairs, this should be about 108 million bytes, which should be a little over a gigabyte.

Is there something I am not considering here?

1 answer

A float takes 16 bytes apiece, and a dict with 1500 entries about 100 KB:

>>> import sys
>>> sys.getsizeof(1.0)
16
>>> d = dict.fromkeys((float(i) for i in range(1500)), 2.0)
>>> sys.getsizeof(d)
98444

so the 22,500 dicts take over 2 GB all by themselves, and the 68 million floats take an extra GB or so. Not sure how you compute 68 million times 16 as equalling only 100M — you may have dropped a zero somewhere.
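To double-check that arithmetic, here is a minimal sketch that re-measures the per-object sizes and totals the two contributions. It assumes a modern CPython 3, where the exact figures differ from those quoted above (e.g. a float is 24 bytes on a typical 64-bit build), but the conclusion is the same:

```python
import sys

# Re-measure per-object sizes on the running interpreter; exact
# numbers vary by CPython version and platform.
float_size = sys.getsizeof(1.0)
d = dict.fromkeys((float(i) for i in range(1500)), 2.0)
dict_size = sys.getsizeof(d)   # the dict structure only, not its keys/values

n_dicts = 1500 * 15            # 22,500 dicts
n_floats = n_dicts * 1500 * 2  # ~68 million floats (one key + one value per entry)

print(f"dicts:  {n_dicts * dict_size / 2**30:.2f} GB")
print(f"floats: {n_floats * float_size / 2**30:.2f} GB")
```

Note that `sys.getsizeof(d)` deliberately excludes the floats the dict points to, so the two totals can be added without double counting.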

The class itself takes a negligible amount, and 1500 instances of it (net of the objects they refer to, which getsizeof counts separately, as it does for the dicts) don't take much either. To wit:

>>> class Sic(object): pass
...
>>> sys.getsizeof(Sic)
452
>>> sys.getsizeof(Sic())
32
>>> sys.getsizeof(Sic().__dict__)
524

That's 452 bytes for the class, and (524 + 32) * 1550 = 862K for all the instances — as you can see, that's not the problem; the memory is going to the dicts and the floats.
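This can be confirmed directly on an instance shaped like the question's. `sys.getsizeof` does not follow references, so a recursive helper is needed to see what a whole object graph really costs; the `deep_getsizeof` below is not part of the original answer, just a common hand-rolled pattern, written for Python 3:

```python
import sys

def deep_getsizeof(obj, seen=None):
    """Approximate size of obj plus everything it references.

    sys.getsizeof alone reports only the container's own footprint,
    not the keys, values, or attributes it points to.
    """
    if seen is None:
        seen = set()
    if id(obj) in seen:          # don't count shared objects twice
        return 0
    seen.add(id(obj))
    size = sys.getsizeof(obj)
    if isinstance(obj, dict):
        size += sum(deep_getsizeof(k, seen) + deep_getsizeof(v, seen)
                    for k, v in obj.items())
    elif isinstance(obj, (list, tuple, set, frozenset)):
        size += sum(deep_getsizeof(item, seen) for item in obj)
    elif hasattr(obj, "__dict__"):
        size += deep_getsizeof(obj.__dict__, seen)
    return size

class Sic:
    pass

# Build one instance shaped like the question's:
# 15 dicts, each holding 1500 float-to-float pairs.
inst = Sic()
for n in range(15):
    setattr(inst, f"d{n}", {float(i): float(i) + 0.5 for i in range(1500)})

print(deep_getsizeof(inst))  # several megabytes for a single instance
```

Multiplying that single-instance figure by ~1500 instances lands in the same multi-gigabyte range as the arithmetic above, with the dicts and floats dominating.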

