I am trying to read some text files where each line needs to be processed. At the moment I just use a StreamReader and read each line individually.
I am wondering whether there is a more efficient way (in terms of lines of code and readability) to do this using LINQ without sacrificing performance. The examples I have seen involve loading the whole file into memory and then processing it, which in this case I do not think would be a good idea. In the first example below, the files can get quite big (up to about 50k); in the second example, not all lines of the file need to be read (sizes are usually under 10 KB).
You could argue that it does not really matter for files this small right now, but I believe this kind of approach leads to inefficient code.
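For reference, the kind of lazy line enumeration I have in mind looks something like the hypothetical helper below. It yields lines one at a time, so a LINQ query over it never loads the whole file into memory. (The name ReadLines is illustrative only; .NET 4 later added an equivalent built-in, System.IO.File.ReadLines.)

    using System.Collections.Generic;
    using System.IO;

    // Hypothetical helper: lazily yields one line at a time, so the
    // file is read on demand as the consumer enumerates and is never
    // loaded whole into memory.
    static IEnumerable<string> ReadLines(string path)
    {
        using (var reader = File.OpenText(path))
        {
            string line;
            while ((line = reader.ReadLine()) != null)
                yield return line;
        }
    }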
First example:
// Open file
using (var file = System.IO.File.OpenText(_LstFilename))
{
    // Read file
    while (!file.EndOfStream)
    {
        String line = file.ReadLine();

        // Ignore empty lines
        if (line.Length > 0)
        {
            // Create addon
            T addon = new T();
            addon.Load(line, _BaseDir);

            // Add to collection
            collection.Add(addon);
        }
    }
}
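Rewritten with LINQ over such a lazy enumerator, the first example might look like the sketch below. Two assumptions: ReadLines is the hypothetical helper above (or File.ReadLines on .NET 4+), and the enclosing generic parameter T has a new() constraint, which the original new T() already implies.

    using System.Linq;

    // Streams the file: Where/Select are lazy, so each line is
    // filtered and converted as it is read, not after a full load.
    var collection = ReadLines(_LstFilename)
        .Where(line => line.Length > 0)   // ignore empty lines
        .Select(line =>
        {
            T addon = new T();
            addon.Load(line, _BaseDir);
            return addon;
        })
        .ToList();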
Second example:
// Open file
using (var file = System.IO.File.OpenText(datFile))
{
    // Compile regexes
    Regex nameRegex = new Regex("IDENTIFY (.*)");

    while (!file.EndOfStream)
    {
        String line = file.ReadLine();

        // Check name
        Match m = nameRegex.Match(line);
        if (m.Success)
        {
            _Name = m.Groups[1].Value;

            // Remove me when other values are read
            break;
        }
    }
}
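A LINQ sketch of the second example, under the same assumptions: FirstOrDefault stops enumerating at the first successful match, so the rest of the file is never read, just like the break in the loop above.

    using System.Linq;
    using System.Text.RegularExpressions;

    Regex nameRegex = new Regex("IDENTIFY (.*)");

    // FirstOrDefault pulls lines only until a match succeeds, then
    // stops enumerating; it returns null if no line matches.
    Match m = ReadLines(datFile)
        .Select(line => nameRegex.Match(line))
        .FirstOrDefault(match => match.Success);

    if (m != null)
        _Name = m.Groups[1].Value;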
Tags: c#, linq, line
Luca Spiller, Aug 13 '09 at 10:41