List memory usage in ipython and jupyter

I have several (almost ten) GB of memory occupied by the ipython kernel. I think this comes from large objects (matrices, lists, numpy arrays, ...) that I created during some operation and no longer need.

I would like to list all the objects I have defined and sort them by their memory footprint. Is there an easy way to do this? There is an nbytes attribute for certain types, but not for all of them, so I am looking for a general way to list all the objects I created along with their memory usage.

python numpy memory ipython jupyter-notebook
1 answer

Assuming you are using ipython or jupyter , you will need to do a bit of work to get a list of all the objects you have defined. That means taking everything available in globals() and filtering out modules , builtins , ipython objects , and so on. Once you are sure you are left with only your own objects, you can grab their sizes with sys.getsizeof . This can be summarized as follows:

```python
import sys

# These are the usual ipython objects, including this one you are creating
ipython_vars = ['In', 'Out', 'exit', 'quit', 'get_ipython', 'ipython_vars']

# Get a sorted list of the objects and their sizes
sorted([(x, sys.getsizeof(globals().get(x))) for x in dir()
        if not x.startswith('_') and x not in sys.modules and x not in ipython_vars],
       key=lambda x: x[1], reverse=True)
```
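The sizes come back as raw byte counts, which are hard to scan when you are hunting for multi-GB objects. A small formatting helper makes the list readable (a sketch — `humanize_bytes` is a name made up here, not a standard function):

```python
def humanize_bytes(n):
    """Format a byte count as a human-readable string.

    Hypothetical helper for display purposes; divide by 1024 until the
    value fits in the current unit.
    """
    for unit in ('B', 'KiB', 'MiB', 'GiB'):
        if n < 1024:
            return f'{n:.1f} {unit}'
        n /= 1024
    return f'{n:.1f} TiB'

print(humanize_bytes(123))           # small values stay in bytes
print(humanize_bytes(10 * 1024**3))  # the multi-GB case from the question
```

You can map this over the second element of each `(name, size)` pair produced by the snippet above before printing.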

Please keep in mind that for python objects (created with python's built-in types) sys.getsizeof will be quite accurate, but it may be inaccurate for objects created by third-party libraries, since it only reports the size the object itself claims. Also, remember that sys.getsizeof adds extra garbage collector overhead if the object is managed by the garbage collector, so some things may look slightly larger than they really are.
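A second caveat worth illustrating: sys.getsizeof is shallow — for a container it counts only the container itself, not the objects it references. If you want a truer picture of a nested structure, you need to recurse (a common sketch, not part of the original answer; `deep_getsizeof` is a name made up here):

```python
import sys

def deep_getsizeof(obj, seen=None):
    """Recursively sum the size of an object and everything it references.

    The `seen` set tracks ids so shared objects are counted only once
    and reference cycles do not loop forever.
    """
    if seen is None:
        seen = set()
    if id(obj) in seen:
        return 0
    seen.add(id(obj))
    size = sys.getsizeof(obj)
    if isinstance(obj, dict):
        size += sum(deep_getsizeof(k, seen) + deep_getsizeof(v, seen)
                    for k, v in obj.items())
    elif isinstance(obj, (list, tuple, set, frozenset)):
        size += sum(deep_getsizeof(item, seen) for item in obj)
    return size

nested = [[1, 2, 3], [4, 5, 6]]
print(sys.getsizeof(nested))   # only the outer list object
print(deep_getsizeof(nested))  # outer list + inner lists + the ints
```

This still will not see memory held privately by C extensions, but it is much closer to reality for plain python containers.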

As a side note, numpy's .nbytes attribute can be somewhat misleading, in that it does not include the memory consumed by the non-element attributes of the array object.
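A concrete illustration of how .nbytes can mislead: a view reports only the bytes of the elements it exposes, even though it keeps the entire base array's buffer alive:

```python
import numpy as np

a = np.ones(1_000_000)   # ~8 MB of float64 data
b = a[:10]               # a tiny view into the same buffer

print(b.nbytes)          # 80 -- just the 10 elements the view sees
print(b.base is a)       # True -- the full ~8 MB buffer stays alive via b
```

So if you delete `a` but keep `b`, the 8 MB buffer is not freed, yet nothing in your namespace will report a size anywhere near 8 MB.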

Hope this helps.

