Memory leak using Entity Framework

I have a very simple application using EF, but after it runs for a week, memory usage is terrible (80 MB at start, 700 MB after one week). When I profile the application with dotMemory, I can see that the Generation 2 heap grows continuously.

[dotMemory chart: Generation 2 heap already climbing after only 40 minutes of running]

We took a snapshot and found that the EF DbContext retains the most bytes.


I'm confused, because my application is very simple. Code example:

    protected CarbonBrushMonitorEntities _entities = new CarbonBrushMonitorEntities();

    public void Add(HistoryData data)
    {
        _entities.HistoryDatas.Add(data);
        _entities.SaveChanges();
    }

_entities is created once at startup and is then used for the lifetime of the application.

The Add function is called frequently, about 3 times per second.

I have googled for a long time and tried several suggestions, such as:

    _entities.Configuration.ValidateOnSaveEnabled = false;
    _entities.Configuration.AutoDetectChangesEnabled = false;
    _entities.Configuration.LazyLoadingEnabled = false;

but they do not work.

Tags: c#, entity-framework

2 answers

If you are using Entity Framework, you should create the context just before you need it and dispose of it as soon as possible:

    using (var someContext = new SomeContext())
    {
        // your commands/queries
    }

Never keep a context in memory or share it across different calls.
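Applied to the code from the question, a minimal sketch of this pattern (reusing the asker's CarbonBrushMonitorEntities context and HistoryData entity) would be:

    public void Add(HistoryData data)
    {
        // A fresh context per operation: everything it tracked is
        // released when the context is disposed, so the Generation 2
        // heap no longer accumulates tracked entities.
        using (var entities = new CarbonBrushMonitorEntities())
        {
            entities.HistoryDatas.Add(data);
            entities.SaveChanges();
        }
    }

Constructing a DbContext is cheap; the expensive model building is cached per AppDomain after first use, so a context per call is fine even at 3 inserts per second.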

I usually register a context with an IoC container:

  DependencyFactory.RegisterType(typeof(SomeContext)); 

and use a context resolver (also registered in the IoC container, of course), for example:

    using (var someContext = _contextResolver.ResolveContext())
    {
        // your commands/queries
    }

where the resolver is implemented like this:

    public class ContextResolver : IContextResolver
    {
        public ISomeContext ResolveContext()
        {
            return DependencyFactory.Resolve<SomeContext>();
        }
    }
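For completeness, a minimal sketch of the abstractions this resolver assumes (ISomeContext and IContextResolver are names from this answer, not Entity Framework types; SomeContext must implement ISomeContext):

    // Hypothetical context abstraction; IDisposable so callers can
    // wrap resolved contexts in a using block.
    public interface ISomeContext : IDisposable
    {
        int SaveChanges();
    }

    public interface IContextResolver
    {
        ISomeContext ResolveContext();
    }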

The EF context is effectively your unit of work, and it should be disposed as soon as you no longer need it.


Another way is to clear the change tracker of the relevant entities, or even of all tracked entities, by setting each entry's state to Detached. Do this after calling dbContext.SaveChangesAsync():

    protected void DisposeDbset<T>() where T : class
    {
        // Detach every tracked entity of type T so the change tracker
        // releases its references and the objects can be collected.
        var trackedEntries = _unitOfWork.dbContext.ChangeTracker.Entries<T>();
        foreach (var entry in trackedEntries.ToList())
        {
            entry.State = EntityState.Detached;
        }
        // Forcing a collection is usually unnecessary once the entries
        // are detached; the GC will reclaim them on its own.
        GC.Collect();
    }

I recently ran into a similar situation when inserting 3,000,000 rows in a batch operation. After the rows were inserted, the change-tracking entries for all of them stayed in memory with state Unchanged, so the change tracker kept accumulating after every call to SaveChangesAsync().
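As a sketch, the batch scenario looks roughly like this (the batch size, the rows list, and the HistoryDatas set are illustrative; DisposeDbset is the helper above):

    // Hypothetical batch insert: save in chunks, then detach the
    // tracked entries so memory stays flat across millions of rows.
    const int batchSize = 1000;
    for (int i = 0; i < rows.Count; i += batchSize)
    {
        foreach (var row in rows.Skip(i).Take(batchSize))
        {
            _unitOfWork.dbContext.HistoryDatas.Add(row);
        }
        await _unitOfWork.dbContext.SaveChangesAsync();
        DisposeDbset<HistoryData>(); // clear the tracker between batches
    }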

I could not afford a new DbContext instance for each batch, because in my case that was the more expensive operation.

Just for your information, I also set dbContext.ChangeTracker.QueryTrackingBehavior = QueryTrackingBehavior.NoTracking, but that only applies to reading data, not to inserts.
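For reference, no-tracking behavior can also be requested per query with AsNoTracking(), which exists in both EF6 and EF Core. The Timestamp property and since variable here are illustrative, not from the question:

    // Read-only query: returned entities are never added to the
    // change tracker, so they do not pin memory.
    var recent = _entities.HistoryDatas
                          .AsNoTracking()
                          .Where(h => h.Timestamp > since)
                          .ToList();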

Hope this helps. I found my solution via this link: http://andreyzavadskiy.com/2016/09/23/entries-in-entity-framework-changetracker-could-degrade-database-write-performance/

