This is not a complete answer, but: if you have a 20 GB CSV file, you will need 20 GB+ of memory to hold it all at once, unless your reader compresses everything in memory (unlikely). You need to read the file in chunks, and the approach of loading everything into an array will not work unless you have a huge amount of RAM.
You will need a loop more like this:
```csharp
CsvReader reader = new CsvReader(filePath);
CSVItem item = reader.ReadNextItem();
while (item != null)
{
    DoWhatINeedWithCsvRow(item);
    item = reader.ReadNextItem();
}
```
C#'s memory management (the garbage collector) will then be smart enough to dispose of old CSVItems as you move past them, provided you don't keep references to them hanging around.
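The `CsvReader` type above is a placeholder, not a real API. Here is a minimal runnable sketch of the same streaming pattern using only `StreamReader` from the standard library; note that the naive `Split(',')` doesn't handle quoted fields containing commas, so for real-world CSVs a proper parsing library would be safer:

```csharp
using System;
using System.IO;

class StreamingCsvExample
{
    static void Main()
    {
        // Hypothetical path; substitute your actual 20 GB file.
        string filePath = "huge.csv";

        // StreamReader reads line by line, so only one row
        // (plus a small internal buffer) is in memory at a time.
        using (var reader = new StreamReader(filePath))
        {
            string line;
            while ((line = reader.ReadLine()) != null)
            {
                // Naive split; fine only for CSVs without quoted commas.
                string[] fields = line.Split(',');
                DoWhatINeedWithCsvRow(fields);
            }
        }
    }

    static void DoWhatINeedWithCsvRow(string[] fields)
    {
        // Process one row here. Don't accumulate the rows in a list,
        // or you're back to holding the whole file in memory.
    }
}
```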
An even better version would be to read a chunk from the CSV (say, 10,000 lines), process all of it, and then fetch the next chunk; or, if you don't care about processing order, spawn a task to run DoWhatINeedWithCsvRow over each chunk.
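A sketch of that chunked variant, assuming processing order doesn't matter (the `ProcessBatch` helper and chunk size are illustrative, not from the original answer):

```csharp
using System;
using System.Collections.Generic;
using System.IO;
using System.Threading.Tasks;

class ChunkedCsvExample
{
    const int ChunkSize = 10_000;

    static void Main()
    {
        var tasks = new List<Task>();

        using (var reader = new StreamReader("huge.csv"))
        {
            var chunk = new List<string>(ChunkSize);
            string line;
            while ((line = reader.ReadLine()) != null)
            {
                chunk.Add(line);
                if (chunk.Count == ChunkSize)
                {
                    // Hand the full chunk to a worker task,
                    // then start filling a fresh one.
                    var batch = chunk;
                    tasks.Add(Task.Run(() => ProcessBatch(batch)));
                    chunk = new List<string>(ChunkSize);
                }
            }
            // Don't forget the final partial chunk.
            if (chunk.Count > 0)
                tasks.Add(Task.Run(() => ProcessBatch(chunk)));
        }

        Task.WaitAll(tasks.ToArray());
    }

    static void ProcessBatch(List<string> rows)
    {
        foreach (var row in rows)
        {
            // Per-row processing goes here; the batch becomes
            // collectible once this method returns.
        }
    }
}
```

One caveat: this version queues every chunk as its own task, so on a 20 GB file the queued chunks themselves could eat memory. In practice you'd bound the number of in-flight batches, for example with `Parallel.ForEach` or a producer/consumer queue.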