The fastest way to read a very large text file in C#

I have a fairly simple question. I have several text files, each containing several GB of data. I have a C# WPF application that I use to process similar data files, but nothing close to this size (currently about 200-300 MB). How can I read this data efficiently and then write it somewhere else after processing, without freezes or crashes? More generally, what is the best way to read from a very large file? At my current small scale I use System.IO.File.ReadAllLines to read and a StreamWriter to write. I'm fairly sure these two methods are not a good idea for files this large. I don't have much experience with C#; any help would be appreciated!
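For reference, this is roughly what the approach described above looks like (file names and the per-line step are placeholders). File.ReadAllLines materializes the entire file in memory as a string[] before any processing starts, which is why it stops being practical once inputs reach several GB:

    using System.IO;

    class CurrentApproach
    {
        static void Main()
        {
            // Loads every line of the input into memory at once (string[]).
            // Fine at 200-300 MB, but a multi-GB file risks OutOfMemoryException
            // and freezes the WPF UI thread while it loads.
            string[] lines = File.ReadAllLines("input.txt");

            using (var writer = new StreamWriter("output.txt"))
            {
                foreach (string line in lines)
                {
                    writer.WriteLine(line); // per-line processing would go here
                }
            }
        }
    }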

+4
2 answers

If you can process the file one line at a time, then the answer is simple (a sketch follows the list):

  • Read the line.
  • Process the line.
  • Write the line.
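A minimal sketch of those three steps using StreamReader and StreamWriter (file names and the Process method are placeholders); only one line is held in memory at a time:

    using System.IO;

    class LineByLine
    {
        static void Main()
        {
            using (var reader = new StreamReader("input.txt"))
            using (var writer = new StreamWriter("output.txt"))
            {
                string line;
                while ((line = reader.ReadLine()) != null)   // read the line
                {
                    string result = Process(line);           // process the line
                    writer.WriteLine(result);                // write the line
                }
            }
        }

        // Placeholder for whatever per-line transformation is needed.
        static string Process(string line) => line;
    }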

If you want to speed this up, put each of those three steps behind a BlockingCollection with a bounded capacity of about 10, so that a slower step is never left waiting on a faster one. And if you can, send the output to a different physical disk than the input (assuming the output goes to disk).
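Here is one rough way to wire up that bounded pipeline with BlockingCollection<string> and three tasks; the capacities, file names, and Process method are illustrative assumptions:

    using System.Collections.Concurrent;
    using System.IO;
    using System.Threading.Tasks;

    class Pipeline
    {
        static void Main()
        {
            // Bounded queues (capacity ~10): a fast stage blocks instead of
            // racing far ahead of a slow one, so memory use stays flat.
            var readLines = new BlockingCollection<string>(boundedCapacity: 10);
            var processedLines = new BlockingCollection<string>(boundedCapacity: 10);

            var readStage = Task.Run(() =>
            {
                foreach (var line in File.ReadLines("input.txt"))
                    readLines.Add(line);
                readLines.CompleteAdding();
            });

            var processStage = Task.Run(() =>
            {
                foreach (var line in readLines.GetConsumingEnumerable())
                    processedLines.Add(Process(line));
                processedLines.CompleteAdding();
            });

            var writeStage = Task.Run(() =>
            {
                using (var writer = new StreamWriter("output.txt"))
                {
                    foreach (var line in processedLines.GetConsumingEnumerable())
                        writer.WriteLine(line);
                }
            });

            Task.WaitAll(readStage, processStage, writeStage);
        }

        static string Process(string line) => line; // placeholder per-line transform
    }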

The OP changed the requirements even after being asked (twice) whether the processing was line by line. With units of work instead of single lines, the steps become:

  • Read lines to build a unit of work (from open tag to close tag).
  • Process the unit of work.
  • Write the unit of work (sketched below).
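A rough sketch of that grouping; the <record> / </record> tags and the ProcessUnit method are assumptions, since the actual file format is not shown:

    using System.Collections.Generic;
    using System.IO;

    class UnitOfWorkExample
    {
        static void Main()
        {
            using (var reader = new StreamReader("input.txt"))
            using (var writer = new StreamWriter("output.txt"))
            {
                var unit = new List<string>();
                bool inUnit = false;
                string line;
                while ((line = reader.ReadLine()) != null)
                {
                    if (line.Contains("<record>"))      // hypothetical open tag
                        inUnit = true;

                    if (inUnit)
                        unit.Add(line);

                    if (line.Contains("</record>"))     // hypothetical close tag
                    {
                        foreach (var outLine in ProcessUnit(unit))
                            writer.WriteLine(outLine);  // write the unit of work
                        unit.Clear();
                        inUnit = false;
                    }
                }
            }
        }

        // Placeholder: transform one complete unit of work into output lines.
        static IEnumerable<string> ProcessUnit(List<string> unit) => unit;
    }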
+3

It may be possible to overlap some of the transformation steps.

https://msdn.microsoft.com/en-us/library/dd997372(v=vs.110).aspx
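As one illustration of overlapping the work (not necessarily what the linked article does), the read of the next block can be kept in flight while the previous block is being processed; the block size, file names, and Process method are assumptions:

    using System.IO;
    using System.Threading.Tasks;

    class OverlappedRead
    {
        static async Task Main()
        {
            const int BlockSize = 512 * 1024;          // assumed block size
            var bufferA = new byte[BlockSize];
            var bufferB = new byte[BlockSize];

            using (var input = new FileStream("input.txt", FileMode.Open, FileAccess.Read,
                                              FileShare.Read, BlockSize, useAsync: true))
            {
                // Keep one read in flight while the previously read block is processed.
                Task<int> pending = input.ReadAsync(bufferA, 0, BlockSize);
                byte[] current = bufferA, next = bufferB;

                while (true)
                {
                    int bytesRead = await pending;
                    if (bytesRead == 0) break;          // end of file

                    pending = input.ReadAsync(next, 0, BlockSize); // start the next read
                    Process(current, bytesRead);                   // while processing this block

                    var tmp = current; current = next; next = tmp; // swap buffers
                }
            }
        }

        // Placeholder block-level transform. Note: blocks do not respect line
        // boundaries, so a real implementation must handle lines split across blocks.
        static void Process(byte[] buffer, int count) { }
    }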

+1
