I have a bunch of images in an HDF5 file that I would like to load and analyze. Each image is 1920x1920 uint16, and loading all of them into memory crashes the computer. I was told that others work around this by slicing the images: for example, if the data is 1920x1920x100 (100 images), they read the first 80 rows of each image, analyze that slice, and then move on to the next chunk. Reading this way works without problems, but when I try to create a dataset in the HDF5 file, I get: TypeError: Can't convert element 0 ... to hsize_t
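To make the workflow concrete, here is a rough sketch of the slicing approach I have in mind (the file name 'h5file.hdf5', the dataset name 'data', and the analysis step are placeholders):

    import h5py

    # Rough sketch of the row-slicing idea: read an 80-row slab of every
    # image at a time instead of the full 1920x1920x100 array.
    # 'h5file.hdf5' and 'data' are placeholder names.
    with h5py.File('h5file.hdf5', 'r') as f:
        dset = f['data']                  # shape (1920, 1920, 100), uint16
        rows_per_chunk = 80
        for start in range(0, dset.shape[0], rows_per_chunk):
            # Only this slab is loaded into memory at a time.
            slab = dset[start:start + rows_per_chunk, :, :]
            # ... analyze slab here ...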
I can recreate the problem with this very simplified code:
    import h5py
    import numpy as np

    with h5py.File('h5file.hdf5', 'w') as f:
        data = np.random.randint(100, size=(15, 15, 20))
        data_set = f.create_dataset('data', data, dtype='uint16')
which gives the result:
TypeError: Can't convert element 0 ([[29 50 75...4 50 28 36 13 72]]) to hsize_t
I also tried removing "data_set =" and "dtype='uint16'", but I still get the same error. The code then becomes:
    with h5py.File('h5file.hdf5', 'w') as f:
        data = np.random.randint(100, size=(15, 15, 20))
        f.create_dataset('data', data)
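My only guess so far is that create_dataset takes the shape as its second positional argument, so the array might be getting interpreted as a shape. If that were the case, passing the array with the data keyword would look like this, though I have not verified that this is the cause:

    import h5py
    import numpy as np

    # Guess: pass the array via the data keyword instead of positionally.
    with h5py.File('h5file.hdf5', 'w') as f:
        data = np.random.randint(100, size=(15, 15, 20))
        data_set = f.create_dataset('data', data=data, dtype='uint16')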
Can someone give me a hint as to what the problem is? Thanks!