I am working on a system that requires high-performance file I/O (in C#). Basically, I fill large files (~100 MB) sequentially from the beginning of the file to the end. Every ~5 seconds I append ~5 MB to the file (written sequentially from the beginning), and after each buffer I flush the stream. Every few minutes I also need to update a structure that I write at the end of the file (some metadata).
Flushing each buffer causes no performance problems. However, when I update the metadata at the end of the file, I get very poor performance. My assumption is that creating the file (which itself is very fast) does not actually allocate all 100 MB on disk, so when I flush the metadata the OS has to allocate (and zero-fill) all the space up to the end of the file.
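To make that concrete, the metadata update is essentially the following (a rough sketch only; fileLength and metadataBuffer are placeholders, not my exact code):
m_Stream.Seek(fileLength - metadataBuffer.Length, SeekOrigin.Begin); // jump near the end of the ~100 MB file
m_Stream.Write(metadataBuffer, 0, metadataBuffer.Length);            // small metadata block
m_Stream.Flush();                                                    // this flush is the slow one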
Does anyone have an idea how I can overcome this problem?
Thank you so much!
From the comments:
The code is roughly as follows. First the file is opened:
m_Stream = new FileStream(filename, FileMode.CreateNew, FileAccess.Write, FileShare.Write, 8192, false);
m_Stream.SetLength(100 * 1024 * 1024); // pre-size the file to 100 MB
Every few seconds I write ~5 MB:
m_Stream.Seek(m_LastPosition, SeekOrigin.Begin); // continue from where the previous block ended
m_Stream.Write(buffer, 0, buffer.Length);
m_Stream.Flush();
m_LastPosition += buffer.Length;
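One idea I was considering (just a sketch, assuming the allocation/zero-fill explanation above is correct): touch the last byte of the file right after the SetLength call, so the whole 100 MB gets allocated once at creation time instead of on the first metadata flush:
m_Stream.Seek(100 * 1024 * 1024 - 1, SeekOrigin.Begin); // seek to the last byte of the pre-sized file
m_Stream.WriteByte(0);                                  // forces the OS to allocate/zero-fill the whole file now
m_Stream.Flush();
m_Stream.Seek(0, SeekOrigin.Begin);                     // back to the start for the normal sequential writes
I have not verified that this actually moves the cost to creation time, so other suggestions are welcome.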
Lior ohana