Here is my code: it opens the XML file (old.xml), filters out the invalid characters, and writes the result to another XML file (abc.xml). Finally, I reload the XML (abc.xml). When executing the following line, there is an exception indicating that the XML file is being used by another process:
    xDoc.Load("C:\\abc.xml");
Does anyone have any idea what is wrong? Are there any handle leaks in my code, and if so, why? (I always use the "using" keyword, so I am confused to see leaks...)
Here is my whole code. I am using C# + VSTS 2008 on Windows Vista x64.
    // Create an instance of StreamReader to read from a file.
    // The using statement also closes the StreamReader.
    Encoding encoding = Encoding.GetEncoding(
        "utf-8",
        new EncoderReplacementFallback(String.Empty),
        new DecoderReplacementFallback(String.Empty));

    using (TextWriter writer = new StreamWriter(
        new FileStream("C:\\abc.xml", FileMode.Create), Encoding.UTF8))
    {
        using (StreamReader sr = new StreamReader("C:\\old.xml", encoding))
        {
            int bufferSize = 10 * 1024 * 1024; // could be anything
            char[] buffer = new char[bufferSize];

            // Read from the file until the end of the file is reached.
            int actualsize = sr.Read(buffer, 0, bufferSize);
            writer.Write(buffer, 0, actualsize);
            while (actualsize > 0)
            {
                actualsize = sr.Read(buffer, 0, bufferSize);
                writer.Write(buffer, 0, actualsize);
            }
        }
    }

    try
    {
        XmlDocument xDoc = new XmlDocument();
        xDoc.Load("C:\\abc.xml");
    }
    catch (Exception ex)
    {
        Console.WriteLine(ex.Message);
    }
EDIT1: I tried resizing the buffer from 10 MB to 1 MB and it works! I am so confused; any ideas?
EDIT2: I find this problem very easy to reproduce when the input old.xml file is very large, e.g. around 100 MB. I suspect this is a known .NET bug? I am going to use tools like Process Explorer / Process Monitor to find out which process is holding the file open and preventing XmlDocument.Load from accessing it.
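While waiting to identify the blocking process, one way to rule out a handle problem on my own side might be to load the document from a FileStream whose lifetime I control, rather than passing a path to XmlDocument.Load. This is just a minimal sketch of that idea; the temp-file path and the sample XML content are made up for illustration, not from my real data:

```csharp
using System;
using System.IO;
using System.Xml;

class LoadFromStreamSketch
{
    static void Main()
    {
        // Hypothetical stand-in for C:\abc.xml, so the sketch is self-contained.
        string path = Path.Combine(Path.GetTempPath(), "abc.xml");
        File.WriteAllText(path, "<root><item/></root>");

        // Open the file with explicit sharing flags. If another process still
        // held the file exclusively, this constructor would throw the same
        // IOException, but here the handle's lifetime is fully visible.
        using (FileStream fs = new FileStream(
            path, FileMode.Open, FileAccess.Read, FileShare.Read))
        {
            XmlDocument xDoc = new XmlDocument();
            xDoc.Load(fs); // load from the stream, not from the path
            Console.WriteLine(xDoc.DocumentElement.Name);
        } // the FileStream is deterministically closed here
    }
}
```

If this variant still fails on the large file, the lock presumably comes from another process (e.g. an indexer or antivirus scanner) rather than from my own code.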
c# file memory-leaks
George2