ASP.NET MVC / Web API file upload: memory is not freed after the upload

I am investigating a possible memory leak in a project where users upload files. The files are usually ZIP archives or files used by other software, and the average file size is 80 MB.

There is an MVC application with a view for uploading files. This view sends a POST request to an action on the controller. The controller action receives the file and forwards it using MultipartFormDataContent, similar to the approaches described in "Sending binary data along with a REST API request" and "WEB API FILE UPLOAD, SINGLE OR MULTIPLE FILES".

Inside the action, I read the file and convert it to a byte array. After the conversion, I send a POST request to my API with the byte[] array.

Here is the MVC application code that does this:

    [HttpPost]
    public async Task<ActionResult> Create(ReaderCreateViewModel model)
    {
        HttpPostedFileBase file = Request.Files["Upload"];
        string fileName = file.FileName;

        using (var client = new HttpClient())
        {
            using (var content = new MultipartFormDataContent())
            {
                using (var binaryReader = new BinaryReader(file.InputStream))
                {
                    model.File = binaryReader.ReadBytes(file.ContentLength);
                }

                var fileContent = new ByteArrayContent(model.File);
                fileContent.Headers.ContentDisposition = new ContentDispositionHeaderValue("attachment")
                {
                    FileName = file.FileName
                };
                content.Add(fileContent);

                var requestUri = "http://localhost:52970/api/upload";
                HttpResponseMessage response = client.PostAsync(requestUri, content).Result;

                if (response.IsSuccessStatusCode)
                {
                    return RedirectToAction("Index");
                }
            }
        }

        return View("Index", model);
    }

After exploring several memory profiling tools (for example, Best Practices No. 5: Detecting .NET Memory Leaks), I found that after the file is converted to a byte array in this line:

    using (var binaryReader = new BinaryReader(file.InputStream))
    {
        model.File = binaryReader.ReadBytes(file.ContentLength);
    }

Memory usage increases from roughly 70 MB to roughly 175 MB, and even after the request is sent and completed, the memory is never freed. If I keep uploading files, the memory just keeps growing until the server goes down completely.

We cannot send the files directly from the multipart form to the API, because we need to send and verify some data first (business requirements/rules). After some research I arrived at this approach, but the apparent memory leak concerns me.

Am I missing something? Shouldn't the garbage collector reclaim the memory immediately? I use a using statement for every disposable object, but it does not help.

I would also appreciate feedback on this approach to uploading files. Should I be doing anything differently?

Just for clarity: the API is separate from the MVC application (each is hosted as its own website in IIS), and everything is in C#.

1 answer

1. Shouldn't the garbage collector reclaim the memory immediately?

The garbage collector does not free memory right away because collection is an expensive operation. While a garbage collection runs, all of your application's managed threads are suspended, which introduces unwanted pauses. So the collector runs only occasionally, guided by sophisticated heuristics.
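
A minimal sketch of this delayed reclamation, assuming a simple console program (the exact numbers vary with runtime and GC mode):

    using System;

    class GcTimingDemo
    {
        static void Main()
        {
            AllocateTemporaryBuffer();

            // The buffer is unreachable here, but its memory is usually still counted:
            // the collector simply has not run yet.
            Console.WriteLine("Before GC: {0:N0} bytes", GC.GetTotalMemory(false));

            GC.Collect();                  // force a full collection (normally you let the GC decide)
            GC.WaitForPendingFinalizers();

            // Only now has the 80 MB array actually been reclaimed.
            Console.WriteLine("After GC:  {0:N0} bytes", GC.GetTotalMemory(false));
        }

        static void AllocateTemporaryBuffer()
        {
            var buffer = new byte[80 * 1024 * 1024]; // roughly the size of one uploaded file
            buffer[0] = 1;                           // touch it so the allocation is really committed
        }
    }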

2. I use a using statement for every disposable object, but it does not help.

The using statement is about unmanaged resources, which exist in limited supply (typically IO-related: file handles, database and network connections). Dispose releases those resources deterministically, but it does not free the object's managed memory, so using has no direct effect on garbage collection.
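
To illustrate with a hedged sketch along the lines of the question's code (the file path is a placeholder):

    // Dispose closes the unmanaged handle at the end of the block,
    // but the managed byte[] is reclaimed only by a later garbage collection.
    byte[] bytes;
    using (var stream = System.IO.File.OpenRead("big.zip"))   // holds an unmanaged file handle
    using (var reader = new System.IO.BinaryReader(stream))
    {
        bytes = reader.ReadBytes((int)stream.Length);
    } // the file handle is released here, deterministically

    // 'bytes' is purely managed memory; nothing in the using blocks frees it.
    // It merely becomes eligible for collection once it is unreachable.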

3. Am I missing something?

It doesn't look like you need the original byte array once you have wrapped it in ByteArrayContent. You never clear model.File after packing it, and the model, array included, can still be passed to the Index view, which keeps the array reachable.

I would replace:

    using (var binaryReader = new BinaryReader(file.InputStream))
    {
        model.File = binaryReader.ReadBytes(file.ContentLength);
    }
    var fileContent = new ByteArrayContent(model.File);

with:

    ByteArrayContent fileContent = null;
    using (var binaryReader = new BinaryReader(file.InputStream))
    {
        fileContent = new ByteArrayContent(binaryReader.ReadBytes(file.ContentLength));
    }

to avoid having to clear model.File.
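
Beyond that, if the pre-send validation only needs metadata (file name, size, form fields) rather than the raw bytes, a streaming variant would avoid the large byte[] allocation entirely. A hedged sketch, reusing file and the request URI from the question's action (note it uses await instead of .Result, which is also safer in an async action):

    using (var client = new HttpClient())
    using (var content = new MultipartFormDataContent())
    {
        // Stream the uploaded file straight through instead of buffering it into a byte[].
        var fileContent = new StreamContent(file.InputStream);
        fileContent.Headers.ContentDisposition = new ContentDispositionHeaderValue("attachment")
        {
            FileName = file.FileName
        };
        content.Add(fileContent);

        HttpResponseMessage response = await client.PostAsync("http://localhost:52970/api/upload", content);
        if (response.IsSuccessStatusCode)
        {
            return RedirectToAction("Index");
        }
    }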

4. If I keep uploading files, the memory just keeps growing until the server goes down completely.

With files averaging 80 MB, the byte arrays land on the large object heap (LOH). The LOH is not compacted automatically, and it is collected only during full (generation 2) collections, so it fragments easily. It looks like in your case the large object heap is growing without bound (which can happen).
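
For reference, allocations of 85,000 bytes or more go to the LOH, so these uploads always land there. A quick sketch to see this:

    var small = new byte[80000]; // under the 85,000-byte threshold: small object heap
    var large = new byte[90000]; // at or over the threshold: large object heap

    Console.WriteLine(GC.GetGeneration(small)); // typically 0: a fresh Gen 0 object
    Console.WriteLine(GC.GetGeneration(large)); // 2: LOH objects are collected only with Gen 2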

If you are on (or can upgrade to) .NET 4.5.1 or later, you can force the large object heap to be compacted by setting:

    System.Runtime.GCSettings.LargeObjectHeapCompactionMode = GCLargeObjectHeapCompactionMode.CompactOnce;

You need to run this line every time you want to schedule a large object heap compaction for the next full garbage collection; the setting reverts to Default once the compaction has happened.

You can also force the compaction to happen immediately by calling:

    System.Runtime.GCSettings.LargeObjectHeapCompactionMode = GCLargeObjectHeapCompactionMode.CompactOnce;
    System.GC.Collect();

However, if there is a lot of memory to release, this will be a costly operation in terms of time.
