I am trying to open large .csv files (16k+ lines, ~15 columns) in a Python script, and I am running into problems.
I use the built-in open() function to open the file, then create a csv.DictReader from it. The loop is structured like this:
for i, row in enumerate(reader):
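For reference, here is roughly what my script looks like (a minimal sketch of my setup; the filename and the per-row work are placeholders, not my actual code):

    import csv

    # minimal reproduction of my setup; 'data.csv' stands in for the real file
    f = open('data.csv')
    reader = csv.DictReader(f)

    for i, row in enumerate(reader):
        # placeholder for the real per-row processing
        print(i, row)

    f.close()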
However, with any file longer than about 20 lines, the file opens, but within a few iterations I get: ValueError: I/O operation on closed file.
My thought is that I may be running out of memory (although the 16k-line file is only about 8 MB and I have 3 GB of RAM), in which case I expect I would need some kind of buffer to load only sections of the file into memory at a time, as in the sketch below.
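By "buffer" I mean something like this (just a sketch of what I imagine, using itertools.islice to pull a fixed number of rows at a time; the chunk size and filename are arbitrary placeholders):

    import csv
    from itertools import islice

    CHUNK_SIZE = 1000  # arbitrary number of rows to hold in memory at once

    with open('data.csv') as f:  # 'data.csv' is a placeholder name
        reader = csv.DictReader(f)
        while True:
            # read up to CHUNK_SIZE rows into memory
            chunk = list(islice(reader, CHUNK_SIZE))
            if not chunk:
                break
            for row in chunk:
                # placeholder for the real per-row processing
                pass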
Am I on the right track? Or could there be other reasons for the file closing unexpectedly?
Edit: in about half the cases, running this with an 11-line CSV gives me the ValueError. The error does not always occur on the same line.