Close the open h5py data file

In our lab, we store our data in HDF5 files via the Python h5py package.

At the beginning of an experiment, we create the HDF5 file and save data arrays to it one after another as they are acquired. When an experiment crashes or is interrupted, the file is not closed properly. Since our experiments are run from IPython, the reference to the file object remains (somewhere) in memory.

Is there a way to scan all open h5py data objects and close them?

python ipython hdf5 h5py

3 answers

Here's one way to do it (I could not figure out how to check whether a file is already closed without catching an exception; maybe you will):

    import gc
    import h5py

    for obj in gc.get_objects():       # Browse through ALL objects
        if isinstance(obj, h5py.File): # Just HDF5 files
            try:
                obj.close()
            except Exception:
                pass                   # Was already closed
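If you would rather avoid the bare try/except, the loop can also test the file object's truthiness: an open h5py File evaluates as True and a closed one as False. A minimal sketch, assuming that truthiness behavior:

```python
import gc

import h5py

# Sketch: close every h5py File still referenced anywhere in the process.
# An open File object is truthy; a closed one is falsy, so no try/except
# is needed here.
for obj in gc.get_objects():
    if isinstance(obj, h5py.File) and obj:
        obj.close()
```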

Another idea:

How about opening the files with a context manager and the with keyword?

    with h5py.File("some_path.h5") as f:
        f["data1"] = some_data

When the program flow exits the with block, the file is closed no matter what happens, including exceptions.
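A small sketch of that guarantee (the path and dataset name here are made up for illustration): even if the body of the with block raises, the file ends up closed, and the data written before the exception has been flushed to disk.

```python
import os
import tempfile

import h5py

# Illustrative path; any writable location works.
path = os.path.join(tempfile.mkdtemp(), "demo.h5")

try:
    with h5py.File(path, "w") as f:
        f["data1"] = [1, 2, 3]             # written before the failure
        raise RuntimeError("simulated crash")
except RuntimeError:
    pass

# The context manager closed the file despite the exception.
assert not f
```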


PyTables (another Python interface to the HDF5 library) keeps track of all the files it has opened and provides an easy way to force-close them:

    import tables
    tables.file._open_files.close_all()

The _open_files registry also has useful methods that give you information about, and handles to, the open files.


I found that bool(hFile) (i.e. hFile.__bool__()) returns True if the file is open and False otherwise. This may be the easiest way to check. In other words, do this:

    hFile = h5py.File(path_to_file)
    if hFile:
        hFile.close()
