An exception of type "System.OutOfMemoryException" is thrown

I mainly use the Entity Framework to query a huge database. I want to return a list of strings and then write it to a text file.

 List<string> logFilePathFileName = new List<string>();

 var query = from c in DBContext.MyTable
             where c.Condition == something
             select c;

 foreach (var result in query)
 {
     filePath = result.FilePath;
     fileName = result.FileName;
     string temp = filePath + "." + fileName;
     logFilePathFileName.Add(temp);
     if (logFilePathFileName.Count % 1000 == 0)
         Console.WriteLine(temp + "." + logFilePathFileName.Count);
 }

However, I get an exception when logFilePathFileName.Count = 397000. The exception is:

An exception of type 'System.OutOfMemoryException' was thrown.

A first chance exception of type 'System.OutOfMemoryException' occurred in System.Data.Entity.dll

UPDATE:

What I want to do is use another query, say: select the top 1000 and add them to the list — but after the first 1000, then what?

6 answers

Most likely this is not about RAM as such, so increasing your RAM, or even compiling and running your code on a 64-bit machine, will not have a positive effect in this case.

I think this is because .NET collections are limited to a maximum size of 2 GB of memory (whether the process is 32-bit or 64-bit makes no difference).

To solve this, split your list into smaller chunks, and most likely the problem will go away.

Just one possible solution:

 foreach (var result in query)
 {
     ....
     if (logFilePathFileName.Count % 1000 == 0)
     {
         Console.WriteLine(temp + "." + logFilePathFileName.Count);
         // WRITE SOMEWHERE YOU NEED
         logFilePathFileName = new List<string>(); // RESET THE LIST
     }
 }

EDIT

If you need to query in chunks, you can use Skip(...) and Take(...)

Just an explanatory example:

 var first1000 = query.Skip(0).Take(1000);
 var second1000 = query.Skip(1000).Take(1000);

... etc.

Naturally, put this inside your iteration and parameterize it based on the data limits you know or need.
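Put together, a paged loop might look like the sketch below. This reuses the question's query; the Id ordering column and the log.txt path are assumptions, since EF requires a stable OrderBy before Skip/Take:

```csharp
const int pageSize = 1000;
int pageIndex = 0;
List<string> page;
do
{
    // Materialize one page at a time instead of the whole table.
    // "Id" is an assumed key column: EF needs an OrderBy before Skip/Take.
    page = query.OrderBy(c => c.Id)
                .Skip(pageIndex * pageSize)
                .Take(pageSize)
                .AsEnumerable()                          // concatenate in memory, not in SQL
                .Select(c => c.FilePath + "." + c.FileName)
                .ToList();
    File.AppendAllLines("log.txt", page);                // write this chunk out, then drop it
    pageIndex++;
} while (page.Count == pageSize);                        // a short page means we reached the end
```

Each iteration holds at most one page of strings, so memory use stays flat no matter how many rows the table has.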


Why are you collecting data in a List<string> if all you have to do is write it to a text file?

You could simply:

  • Open the text file;
  • Iterate over the records, appending each line to the text file (without storing the lines in memory);
  • Flush and close the text file.

You will need much less memory than now, because you will not be keeping all those strings in memory unnecessarily.
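As a sketch of that approach (reusing the question's query; the output path is a placeholder), everything streams through a single StreamWriter:

```csharp
using (var writer = new StreamWriter("log.txt"))   // placeholder path
{
    foreach (var result in query)
    {
        // Each line goes straight to disk; nothing accumulates in a list.
        writer.WriteLine(result.FilePath + "." + result.FileName);
    }
}   // Dispose flushes and closes the file.
```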


You may need to set some VM arguments to allow more memory! Also: write it directly to your file instead of keeping it in a list.


What Roy Dict says sounds best. You can also try adding a limit to your query, so the result set from your database will not be so big.

More information: limiting the size of a query using Entity Framework.


You should not read all the records from the database into a list; that requires a lot of memory. Instead, combine reading records with writing them to the file: for example, read 1000 records from the database into a list, append them to the text file, clear the list (list.Clear()) to free the memory, and continue with the next records.


From several other topics on StackOverflow, I read that Entity Framework is not designed to handle massive data like this. EF caches and tracks all data in the context and will throw an exception on huge data sets. The options are to use SQL directly or to split your records into smaller sets.
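One EF-side mitigation worth noting (my addition, not part of the original answer) is to disable change tracking for read-only queries, so the context does not hold on to every materialized entity:

```csharp
// AsNoTracking() (System.Data.Entity.QueryableExtensions) tells EF not to
// cache the returned entities in the context, which reduces memory pressure
// for large read-only result sets.
var query = DBContext.MyTable
                     .AsNoTracking()
                     .Where(c => c.Condition == something);  // condition from the question
```

This does not remove the 2 GB object limit, but it stops the context itself from growing with every row read.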

