Writing big data to a file: caching issue

I have a problem when I write a large amount of data (>2 GB) to a file. The first ~1.4 GB is written quickly (~100 MB/s); after that the write speed drops to 0-2 MB/s, which makes the code very slow.

My code (simplified):

    // FILE_FLAG_NO_BUFFERING is a raw Win32 flag, not a FileOptions member:
    //FileOptions FILE_FLAG_NO_BUFFERING = (FileOptions)0x20000000;
    FileOptions fileOptions = FileOptions.SequentialScan;

    int fileBufferSize = 1024 * 1024;
    byte[] buffer = new byte[32768];

    Random random = new Random();
    long fileSize = 2588490188;
    long totalBytesWritten = 0;

    using (FileStream fs = File.Create(@"c:\test\test.bin", fileBufferSize, fileOptions))
    {
        while (totalBytesWritten < fileSize)
        {
            random.NextBytes(buffer);           // fill the 32 KB chunk with random data
            fs.Write(buffer, 0, buffer.Length); // append the chunk to the file
            totalBytesWritten += buffer.Length;
            //Thread.Sleep(10);
        }
    }

I think this is related to caching: during the "fast write" phase, RAM usage increases as well, and as soon as RAM usage stops growing, the write performance drops.
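To see exactly where the throughput collapses, per-chunk timing can be added to the loop. A minimal measuring sketch (path, chunk size, and target size are taken from the code above; the 100 MB reporting interval is an arbitrary choice):

    using System;
    using System.Diagnostics;
    using System.IO;

    class WriteSpeedProbe
    {
        static void Main()
        {
            long fileSize = 2588490188;        // same target size as above
            byte[] buffer = new byte[32768];
            Random random = new Random();
            Stopwatch sw = Stopwatch.StartNew();
            long totalBytesWritten = 0;
            long lastReport = 0;

            using (FileStream fs = File.Create(@"c:\test\test.bin", 1024 * 1024, FileOptions.SequentialScan))
            {
                while (totalBytesWritten < fileSize)
                {
                    random.NextBytes(buffer);
                    fs.Write(buffer, 0, buffer.Length);
                    totalBytesWritten += buffer.Length;

                    // Print the throughput of each 100 MB slice; the drop
                    // should appear once the OS write cache is full.
                    if (totalBytesWritten - lastReport >= 100L * 1024 * 1024)
                    {
                        double mb = (totalBytesWritten - lastReport) / (1024.0 * 1024.0);
                        Console.WriteLine($"{totalBytesWritten / (1024 * 1024)} MB written, {mb / sw.Elapsed.TotalSeconds:F1} MB/s");
                        lastReport = totalBytesWritten;
                        sw.Restart();
                    }
                }
            }
        }
    }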

What I tried:

  • changing to asynchronous writes -> no significant change

  • resizing the buffer array -> no significant change

  • increasing fileBufferSize -> with a large buffer (~100 MB) the write speed alternates in bursts between ~100 MB/s and 0 MB/s, with no overall improvement

  • FileOptions.WriteThrough -> noticeably slower overall (see the unbuffered-write sketch after this list)

  • calling fs.Flush(true) every xx writes -> no significant change

  • Thread.Sleep(10) in the loop -> no significant change
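The commented-out FILE_FLAG_NO_BUFFERING line at the top of the code hints at the usual workaround: bypassing the Windows cache entirely. Below is a sketch of that approach. Caveats: with this flag, write sizes and file offsets must be multiples of the sector size, and strictly speaking Win32 also wants the buffer's memory address sector-aligned, which the CLR does not guarantee for a managed byte[]; treat this as a sketch rather than a drop-in fix.

    using System;
    using System.IO;

    class UnbufferedWrite
    {
        // FILE_FLAG_NO_BUFFERING is not a member of the FileOptions enum,
        // but the FileStream constructor accepts the raw Win32 flag value.
        const FileOptions FileFlagNoBuffering = (FileOptions)0x20000000;

        static void Main()
        {
            long fileSize = 2588490188;
            int chunkSize = 1024 * 1024;   // must be a multiple of the sector size
            byte[] buffer = new byte[chunkSize];
            Random random = new Random();
            long totalBytesWritten = 0;

            // bufferSize: 1 disables FileStream's own buffering, so each
            // Write goes straight through to the OS.
            using (FileStream fs = new FileStream(@"c:\test\test.bin",
                       FileMode.Create, FileAccess.Write, FileShare.None, 1,
                       FileFlagNoBuffering | FileOptions.WriteThrough))
            {
                while (totalBytesWritten < fileSize)
                {
                    random.NextBytes(buffer);
                    fs.Write(buffer, 0, buffer.Length);
                    totalBytesWritten += buffer.Length;
                }
                // The last chunk overshoots fileSize so that every write
                // stays sector-aligned; trim the file afterwards if the
                // exact length matters.
            }
        }
    }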


A follow-up question: was a solution to this ever found? (I see the same behaviour and, like the original poster, things such as Thread.Sleep made no difference.) My loop runs inside a using block; could that be the problem? Here is my code:

    using (FileStream fs = File.Create(@"c:\testing\test.bin", fileBufferSize, fileOptions))
    {
        while (fs.Position < fileSize)
        {
            lock (fs) // this is the bit I have added to try to speed it up
            {
                random.NextBytes(buffer);
                fs.Write(buffer, 0, buffer.Length);
            }
        }
    }

EDIT: fixed a typo in the while condition; it compared fs.Position against fileBufferSize instead of fileSize.

As pointed out in the comments, the lock makes no difference here, since only a single thread ever writes to the stream.
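Aside from removing the lock (it only adds overhead when one thread writes), a variant sometimes suggested is to pre-allocate the file with SetLength before the loop; this can reduce fragmentation, though it does not change the cache behaviour. A minimal sketch (the pre-allocation is an assumption on my part; the rest mirrors the loop above):

    using System;
    using System.IO;

    class PreallocatedWrite
    {
        static void Main()
        {
            long fileSize = 2588490188;        // target size from the question
            int fileBufferSize = 1024 * 1024;
            byte[] buffer = new byte[32768];
            Random random = new Random();

            using (FileStream fs = File.Create(@"c:\testing\test.bin", fileBufferSize, FileOptions.SequentialScan))
            {
                // Reserve the space up front; the position stays at 0.
                // NOTE: pre-allocation is an assumption, not something the
                // original poster tried, and it does not affect caching.
                fs.SetLength(fileSize);

                // No lock: only this thread touches fs.
                while (fs.Position < fileSize)
                {
                    random.NextBytes(buffer);
                    fs.Write(buffer, 0, buffer.Length);
                }
                // The final chunk may overshoot fileSize by up to
                // buffer.Length bytes, slightly extending the file.
            }
        }
    }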

