So, is "merging" really just writing the files out one after another? Pretty much: open one output stream, then open each input stream in turn, copy the data across, and close it. For example:
static void ConcatenateFiles(string outputFile, params string[] inputFiles)
{
    using (Stream output = File.OpenWrite(outputFile))
    {
        foreach (string inputFile in inputFiles)
        {
            using (Stream input = File.OpenRead(inputFile))
            {
                input.CopyTo(output);
            }
        }
    }
}
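A quick usage sketch, just to show the call shape (the file names here are hypothetical):

```csharp
// Concatenate three hypothetical log files into one output file.
// Thanks to the params array, the inputs are passed as plain arguments.
ConcatenateFiles("combined.log", "part1.log", "part2.log", "part3.log");
```

Because the output is opened once and each input is opened, copied, and disposed in sequence, the files end up back to back in the output in argument order.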
This uses the Stream.CopyTo method, which is new in .NET 4. If you're not using .NET 4, you'll need a helper method of your own:
private static void CopyStream(Stream input, Stream output)
{
    byte[] buffer = new byte[8192];
    int bytesRead;
    while ((bytesRead = input.Read(buffer, 0, buffer.Length)) > 0)
    {
        output.Write(buffer, 0, bytesRead);
    }
}
Nothing I'm aware of is more efficient than that... and, importantly, it barely uses any memory on your system. It isn't repeatedly reading the whole file into memory and then writing it all out again.
EDIT: As pointed out in the comments, there are ways you can play with file options to make this somewhat more efficient in terms of what the file system does with the data. But fundamentally you're going to be reading the data and writing it, a buffer at a time, either way.
Jon Skeet