ObjectContext leaks memory for detached objects

I have already checked this with a memory profiler: it is not the actual entity objects that remain alive, but internal hash sets, dictionaries, and EntityKey objects. However, I have not found a way to release these references.

So the simple question is: how do I stop the context (or its ObjectStateManager) from growing in size indefinitely?

[And yes, I know that long-lived contexts should be avoided, but in this case I need to perform one complex analysis that requires loading several hierarchies of data (the example below is just a minimal demonstration of the problem), so technically it is a "short-lived" context with a single operation.]

Steps to reproduce:

  • create a new console application
  • create an EF model for the Northwind database (either use a real SQL Server instance, or copy the Northwind.sdf file from the CD)
  • use the following code:

Code [Updated, no real database connection required]:

    class Program
    {
        static void Main()
        {
            const double MiB = 1024 * 1024;
            using ( var context = new NorthwindEntities() )
            {
                var last = GC.GetTotalMemory(true) / MiB;
                Console.WriteLine("before run: {0:n3} MiB", last);
                var id = 0;
                while ( true )
                {
                    Run(context, ref id);
                    GC.Collect(GC.MaxGeneration, GCCollectionMode.Forced);
                    GC.WaitForPendingFinalizers();
                    var current = GC.GetTotalMemory(true) / MiB;
                    Console.WriteLine("after run: {0:n3} MiB (+{1:n3} MiB)", current, current - last);
                    last = current;
                    if ( Console.KeyAvailable )
                        break;
                    Console.WriteLine(new string('-', 100));
                }
            }
        }

        static void Run(NorthwindEntities context, ref int id)
        {
            for ( int i = 0; i < 100000; i++ )
            {
                var category = new Category { Category_ID = ++id };
                category.EntityKey = new EntityKey("NorthwindEntities.Categories", "Category_ID", id);

                var product = new Product { Product_ID = id, Category_ID = id };
                product.EntityKey = new EntityKey("NorthwindEntities.Products", "Product_ID", id);
                product.Category = category;

                context.Attach(product);
                context.Detach(product);
                context.Detach(category);
            }

            var ctr = 0;
            Console.WriteLine("Enumerating living/attached objects:");
            const EntityState AllStates = EntityState.Added | EntityState.Deleted |
                                          EntityState.Modified | EntityState.Unchanged;
            foreach ( var entry in context.ObjectStateManager.GetObjectStateEntries(AllStates) )
                Console.WriteLine("  #{0} [{1}] {2}", ++ctr, entry.EntityKey, entry.Entity);
            if ( ctr == 0 )
                Console.WriteLine("  NOTHING (as expected)");
        }
    }
Tags: c# memory-leaks entity-framework
2 answers

Since I can only detach objects right after calling SaveChanges(), I now count the number of detached objects, and when the counter reaches 10,000 I detach all remaining (still needed) objects from the context, dispose of it, and create a new context to which I re-attach them. Drawback: the IsLoaded property of EntityReferences and EntityCollections is now always false (but I do not rely on it).
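A minimal sketch of the recycling workaround described above. NorthwindEntities and the 10,000 threshold come from the answer; RecycleContext, DetachThreshold, and the stillNeeded parameter are invented names, and the cast to IEntityWithKey assumes EntityObject-based entities:

```csharp
const int DetachThreshold = 10000; // from the answer: recycle after 10,000 detaches

// Detach the entities we still need, throw the bloated context away,
// and re-attach them to a fresh one.
static NorthwindEntities RecycleContext(NorthwindEntities oldContext,
                                        IList<object> stillNeeded)
{
    foreach ( var entity in stillNeeded )
        oldContext.Detach(entity);

    oldContext.Dispose(); // releases the grown ObjectStateManager internals

    var newContext = new NorthwindEntities();
    foreach ( var entity in stillNeeded )
        newContext.Attach((IEntityWithKey)entity);

    // Caveat from the answer: IsLoaded on EntityReferences/EntityCollections
    // of the re-attached entities is now false.
    return newContext;
}
```

The caller would keep its own detach counter and invoke RecycleContext whenever the counter passes DetachThreshold.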

+1

My understanding is that Detach removes the context from the entity, not the entity from the context. Do not trust the context to release its references to entities (or to their internals).

I agree that this leak is a problem, but many people (including me) have tried and failed to prevent EF contexts from growing indefinitely while queries are being executed.

As a possible workaround, instead of relying on the database as a "workspace" for your calculations, you could recreate the relevant database structure in an in-memory representation of your own, work on that, and then write the results back to the database. If you have a lot of data, you could spill to temporary files.

This should keep each context's lifetime short.

Alternatively, consider using something other than EF, at least for the processing-intensive parts, since EF may not be a good fit here. Something lower-level, such as a DataReader, may suit your situation better.
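For illustration, a minimal sketch of streaming Northwind rows through a raw SqlDataReader instead of EF. The connection string and column list are assumptions; adapt them to your database:

```csharp
using System;
using System.Data.SqlClient;

// Read rows without any change tracking: there is no ObjectStateManager,
// so nothing accumulates per row and each row can be collected immediately.
using ( var connection = new SqlConnection(
            "Data Source=.;Initial Catalog=Northwind;Integrated Security=True") )
using ( var command = new SqlCommand(
            "SELECT ProductID, ProductName, CategoryID FROM Products", connection) )
{
    connection.Open();
    using ( var reader = command.ExecuteReader() )
    {
        while ( reader.Read() )
        {
            int productId = reader.GetInt32(0);
            string name = reader.GetString(1);
            // ...feed the values into your own in-memory analysis structures...
        }
    }
}
```

This trades EF's object materialization and identity tracking for plain values, which is usually what a one-off bulk analysis actually needs.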

0
