I have a problem when writing a large amount of data (>2 GB) to a file. The first ~1.4 GB is written quickly (~100 MB/s), and after that the write becomes very slow (0-2 MB/s).
My code (simplified):
FileOptions fileOptions = FileOptions.SequentialScan;
int fileBufferSize = 1024 * 1024;            // 1 MB FileStream buffer
byte[] buffer = new byte[32768];             // 32 KB chunks
Random random = new Random();
long fileSize = 2588490188;                  // ~2.4 GB
long totalBytesWritten = 0;

using (FileStream fs = File.Create(@"c:\test\test.bin", fileBufferSize, fileOptions))
{
    while (totalBytesWritten < fileSize)
    {
        // Fill the chunk with random data and append it to the file.
        random.NextBytes(buffer);
        fs.Write(buffer, 0, buffer.Length);
        totalBytesWritten += buffer.Length;
    }
}
I think this is related to caching: during the "fast write" phase the RAM usage also keeps increasing, and as soon as the RAM usage stops growing, the performance drops.
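To check that the slowdown really coincides with the OS write cache filling up, the loop can log its own throughput. A minimal sketch (the 100 MB reporting interval is just an arbitrary choice, not part of the original code):

using System;
using System.Diagnostics;
using System.IO;

class ThroughputProbe
{
    static void Main()
    {
        byte[] buffer = new byte[32768];
        Random random = new Random();
        long fileSize = 2588490188;
        long total = 0;
        long sinceLastReport = 0;
        Stopwatch sw = Stopwatch.StartNew();

        using (FileStream fs = File.Create(@"c:\test\test.bin", 1024 * 1024, FileOptions.SequentialScan))
        {
            while (total < fileSize)
            {
                random.NextBytes(buffer);
                fs.Write(buffer, 0, buffer.Length);
                total += buffer.Length;
                sinceLastReport += buffer.Length;

                // Report throughput for every ~100 MB written; the point where the
                // write cache saturates shows up as a sudden drop in MB/s.
                if (sinceLastReport >= 100L * 1024 * 1024)
                {
                    double mbPerSec = sinceLastReport / (1024.0 * 1024.0) / sw.Elapsed.TotalSeconds;
                    Console.WriteLine($"{total / (1024 * 1024)} MB written, {mbPerSec:F1} MB/s");
                    sinceLastReport = 0;
                    sw.Restart();
                }
            }
        }
    }
}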
What I tried:
change to asynchronous writes -> no significant change
resize the buffer array -> no significant change
change fileBufferSize -> the write speed just alternates between ~100 MB/s and 0 MB/s
FileOptions.WriteThrough, periodic fs.Flush(true), and Thread.Sleep(10) between writes -> none of these solved the problem (a sketch of the WriteThrough/Flush variant follows this list)
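For reference, the WriteThrough / Flush(true) variant looks roughly like this (the 256 MB flush interval is only an illustrative value, not the one actually used):

using System;
using System.IO;

class WriteThroughVariant
{
    static void Main()
    {
        byte[] buffer = new byte[32768];
        Random random = new Random();
        long fileSize = 2588490188;
        long total = 0;
        long sinceLastFlush = 0;
        const long flushInterval = 256L * 1024 * 1024;   // assumed interval, for illustration only

        // WriteThrough makes each write go to disk before the call returns
        // instead of piling up in the OS write cache; SequentialScan hints
        // that the file is written front to back.
        FileOptions options = FileOptions.SequentialScan | FileOptions.WriteThrough;

        using (FileStream fs = new FileStream(@"c:\test\test.bin", FileMode.Create,
                                              FileAccess.Write, FileShare.None,
                                              1024 * 1024, options))
        {
            while (total < fileSize)
            {
                random.NextBytes(buffer);
                fs.Write(buffer, 0, buffer.Length);
                total += buffer.Length;
                sinceLastFlush += buffer.Length;

                if (sinceLastFlush >= flushInterval)
                {
                    fs.Flush(true);   // flushToDisk: true pushes OS buffers to the physical disk
                    sinceLastFlush = 0;
                }
            }
        }
    }
}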