How to convert a fixed-size dimension to unlimited in a netCDF file

I download daily 600 MB netcdf-4 files that have this structure:

netcdf myfile {
dimensions:
        time_counter = 18;
        depth = 50;
        latitude = 361;
        longitude = 601;
variables:
        salinity
        temp, etc     

I am looking for the best way to convert the time_counter dimension from a fixed size (18) to an unlimited dimension.

I found a way to do this with the netCDF command-line utilities (ncdump/ncgen) and sed, like this:

ncdump myfile.nc | sed -e "s#^.time_counter = 18;#time_counter = UNLIMITED; // (currently 18)#" | ncgen -o myfileunlimited.nc

which worked for me for small files, but dumping and regenerating the 600 MB netCDF files takes a lot of memory and time.

Does anyone know a different way to do this?


Converting with the ncdump-sed-ncgen pipeline works, but it is slow and memory-hungry: dumping the binary netcdf data as text turns a 600 MB file into roughly 5 GB of CDL, and ncgen then has to parse all of that text back into a netcdf file.

A much better way is to use the NCO toolkit, specifically ncks with its "--mk_rec_dmn" option. Ncks simply copies a netcdf file, and when copying myfile.nc (the file whose dimension we want to change), the "--mk_rec_dmn" switch makes the named dimension the record (unlimited) dimension of the copy:

ncks --mk_rec_dmn time_counter myfile.nc -o myfileunlimited.nc ; mv myfileunlimited.nc myfile.nc 

The inverse operation (turning the record dimension back into a fixed-size one) works the same way, with the --fix_rec_dmn flag:

ncks --fix_rec_dmn time_counter myfile.nc -o myfilefixedsize.nc ; mv myfilefixedsize.nc myfile.nc
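
As a quick sanity check, here is a minimal sketch that confirms time_counter is now a record dimension; it assumes the netCDF4 Python package, which is not part of the answer above (ncdump -h on the output file shows the same information):

# Sketch only: verify the record dimension in the file produced by ncks.
# The netCDF4 package is an assumption, not something used in this answer.
from netCDF4 import Dataset

with Dataset("myfileunlimited.nc") as ds:
    dim = ds.dimensions["time_counter"]
    # isunlimited() returns True for a record (UNLIMITED) dimension.
    print(dim.name, len(dim), dim.isunlimited())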

I sympathize with the sed solution: it is short, transparent, and it clearly does the job.

The slow part, though, is not sed but ncdump and ncgen, which have to translate the whole dataset between text and binary NetCDF.

That cost is built into the dump + gen route: every single value is formatted as text and then parsed back into NetCDF.

In principle only a small piece of metadata has to change, so you might hope to patch the file in place; in practice that means understanding how the information is actually laid out on disk, which here means NetCDF-4 on top of HDF5.

A NetCDF-4 file is really an HDF5 file written with a particular set of conventions. The dimension bookkeeping lives in hidden objects and attributes such as _netcdf_dim_info and _netCDF (dimension names, their sizes, and whether they are unlimited).

In theory you could edit that hidden metadata so that time_counter becomes UNLIMITED (stored as size 0), but the netCDF documentation warns that:

"modifying these files with HDF5 will almost certainly make them unreadable to netCDF-4".

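To see that layering for yourself, here is a sketch of my own (the h5py package is an assumption, not something used above) that opens the .nc file as raw HDF5 and prints each dataset's maxshape; an unlimited axis shows up as a None entry, and it can only exist on a chunked dataset, which is why a fixed, contiguous variable cannot simply be flagged as unlimited:

# Sketch only: peek at the HDF5 layer underneath a netCDF-4 file.
# h5py is an assumption; the file name comes from the question above.
import h5py

with h5py.File("myfile.nc", "r") as f:
    def show(name, obj):
        if isinstance(obj, h5py.Dataset):
            # maxshape contains None for axes HDF5 allows to grow (unlimited);
            # a fixed contiguous dataset has maxshape == shape and chunks is None.
            print(name, obj.shape, obj.maxshape, obj.chunks)
    f.visititems(show)
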
Out of curiosity I timed the individual steps on my own machine (an aging AMD Bulldozer box). ncgen dominated completely, taking roughly 500 s, while ncdump finished in about 12 s; the sed step itself was negligible.

At those rates the disks are nowhere near saturated: the job is CPU-bound, not I/O-bound, so faster storage will not rescue the dump + gen approach.

So, to sum up: the sed pipeline is perfectly fine as an occasional one-off, but for 600 MB files you should budget a good deal of time, and several GB of intermediate CDL text, for every conversion.


In Python you can do this with xarray and its to_netcdf() method, letting Dask do the heavy lifting.

Pass the dimension via unlimited_dims when writing, and open the file with chunks so the data is loaded lazily. For example:

import xarray as xr
# Open lazily (Dask-backed) so the 600 MB file is never read into memory at once.
ds = xr.open_dataset('myfile.nc', chunks={'time_counter': 18})
# Write a copy in which time_counter is the record (UNLIMITED) dimension.
ds.to_netcdf('myfileunlimited.nc', unlimited_dims=['time_counter'])

With Dask doing the chunked reads and writes, xarray streams the data through in pieces instead of holding the whole file in memory.
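
Since the files arrive daily, a natural follow-up is to concatenate them along the now-unlimited time_counter axis. The sketch below is my own; the file pattern and output name are made up, and it assumes the daily files share the same grid:

# Sketch only: combine daily files along time_counter and keep it unlimited.
# 'myfile_*.nc' and 'combined.nc' are hypothetical names.
import xarray as xr

ds = xr.open_mfdataset(
    'myfile_*.nc',                 # hypothetical daily file pattern
    combine='nested',
    concat_dim='time_counter',     # stack the daily records end to end
    chunks={'time_counter': 18},   # keep the Dask chunks modest in memory
)
ds.to_netcdf('combined.nc', unlimited_dims=['time_counter'])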

