How do you minimize the performance hit when upgrading from LINQ to SQL to EF 4.1?

I recently upgraded my application from LINQ to SQL and SQL Server CE 3.5 to Entity Framework 4.1 Code First and SQL Server CE 4.0, and now it runs much slower. I did some before-and-after stopwatch testing, and most of my application's basic operations seem to run about 40% slower on average.

I use all the default strategies and configurations for EF Code First, except that cascading deletes are disabled.
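
To be concrete, the cascade-delete defaults are turned off in OnModelCreating by removing the cascade-delete conventions. The sketch below is illustrative rather than my exact code; AppDbContext and the DbSet names are simplified stand-ins.

using System.Data.Entity;
using System.Data.Entity.ModelConfiguration.Conventions;

public class AppDbContext : DbContext
{
    public DbSet<Target> Targets { get; set; }
    public DbSet<TrialDefinition> TrialDefinitions { get; set; }

    protected override void OnModelCreating(DbModelBuilder modelBuilder)
    {
        // Stop Code First from generating ON DELETE CASCADE for relationships.
        modelBuilder.Conventions.Remove<OneToManyCascadeDeleteConvention>();
        modelBuilder.Conventions.Remove<ManyToManyCascadeDeleteConvention>();
    }
}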

When I initially posted this question, I focused on one query that seemed to take a particularly long time, but I have since realized it was only especially slow on the first run (see comment below).

Now I see that most queries are slower. Not dramatically slower, but enough to add up quickly, since most operations the application performs involve several queries.

This application has a very small database. The SQL CE file (.sdf) is only 458 KB, and the largest table has fewer than 250 rows.

Here is an example of a POCO class:

public class Target
{
    public int Id { get; set; }
    public int TrialDefinitionId { get; set; }
    public int Number { get; set; }
    public int X { get; set; }
    public int Y { get; set; }
    public string Phase { get; set; }

    public virtual TrialDefinition TrialDefinition { get; set; }
}

All my classes follow this basic pattern: simple value-type properties plus virtual navigation properties for the objects referenced by foreign keys. One class uses an ICollection to hold the child list for a one-to-many relationship (a sketch of that pattern is below).
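
For example, the collection side of that one-to-many relationship looks roughly like this (the actual members of TrialDefinition are simplified here):

using System.Collections.Generic;

public class TrialDefinition
{
    public int Id { get; set; }
    public string Name { get; set; }

    // One-to-many: a TrialDefinition has many Targets; each Target
    // points back via TrialDefinitionId / TrialDefinition.
    public virtual ICollection<Target> Targets { get; set; }
}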

Final note: I use the repository pattern as an intermediary, and each use of a repository is wrapped in a using block. For get operations, this means the objects end up detached after I retrieve the data I need from the database (a rough sketch of the pattern follows).
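
The shape of that repository usage is roughly the following; Repository<T> here is a simplified stand-in for my actual class, not the real code.

using System;
using System.Collections.Generic;
using System.Data.Entity;
using System.Linq;

public class Repository<T> : IDisposable where T : class
{
    private readonly DbContext _context = new AppDbContext();

    public List<T> GetAll()
    {
        // ToList() materializes the entities, so they remain usable
        // (detached) after the context is disposed.
        return _context.Set<T>().ToList();
    }

    public void Dispose()
    {
        _context.Dispose();
    }
}

// Typical call site: a context is created and torn down for every operation.
// using (var repo = new Repository<Target>())
// {
//     var targets = repo.GetAll();
// }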

Does anyone have specific strategies for improving the performance of my EF Code First app? Keep in mind that I have not yet had a chance to read up on EF in depth; I am just trying to migrate from LINQ to SQL to EF as quickly and painlessly as possible. The most useful answers for me would suggest specific changes to strategies, configurations, or other settings.

c# sql-server-ce linq-to-sql entity-framework code-first
4 answers

Final note: I use the repository pattern as an intermediary, and each use of a repository is wrapped in a using block. For get operations, this causes the objects to become detached after I retrieve the data I need from the database.

Well, none of that is required:

  • Entity Framework's default architecture already implements the repository pattern.
  • Keeping an ObjectContext alive does not mean you are keeping a database connection open.
  • A connection is taken from the connection pool only when you query or save changes, and it is used just for that operation.

Of course, wrapping every operation in a using block slows things down, because each block will do the following:

  • Initialize the context (which requires loading metadata from resources)
  • Validate a number of things
  • Open a database connection
  • Perform your operations
  • Clean up and close the database connection, then dispose the context

Now, the first two steps are fairly expensive, and you will end up with many live copies of the same entities in your application, because each new context creates its own copy of the same object for every query.

Entity Framework already implements an identity map, which means that within the lifetime of a context it keeps only one live copy of an entity per primary key. That not only saves memory, it is also faster.
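
As a quick illustration (the context and set names are assumed from the question):

using (var ctx = new AppDbContext())
{
    var a = ctx.Targets.First(t => t.Id == 1);
    var b = ctx.Targets.First(t => t.Id == 1);   // second query, same key

    // Both variables refer to the same tracked instance within this context.
    bool sameInstance = ReferenceEquals(a, b);   // true

    // DbSet.Find checks the context's local cache first, so this does not
    // even hit the database if the entity is already tracked.
    var c = ctx.Targets.Find(1);
}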

I would advise against using blocks around each query or each small step; instead, keep your ObjectContext alive for the lifetime of your application. You do not need to implement caching or a repository at all.
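
A minimal sketch of what that looks like, assuming a DbContext subclass such as the AppDbContext above; for a small single-user database like yours this is usually fine:

public static class Data
{
    // Created once, reused by all queries, disposed when the application exits.
    public static readonly AppDbContext Context = new AppDbContext();
}

// Usage elsewhere:
// var targets = Data.Context.Targets.Where(t => t.Phase == "A").ToList();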


Read here and here about the inner workings of the Entity Framework. Those articles cover EFv4 and the ObjectContext API, but EFv4.1 with the DbContext API is just a wrapper around EFv4.

If you think your query is slow, try running it twice in the same context and then twice in two different context instances. The first test checks whether the problem is object materialization, because objects are materialized only for the first query; the second checks whether there is a problem with context initialization (which should not happen if you use standard context creation with a connection string). A rough timing sketch follows.
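
Something like this, where the context type, set name, and filter are placeholders based on the question:

using System;
using System.Diagnostics;
using System.Linq;

static class QueryTiming
{
    static void Run()
    {
        var sw = Stopwatch.StartNew();
        using (var ctx = new AppDbContext())
        {
            var first = ctx.Targets.Where(t => t.Phase == "A").ToList();
            Console.WriteLine("1st query, 1st context: {0} ms", sw.ElapsedMilliseconds);

            sw.Restart();
            var second = ctx.Targets.Where(t => t.Phase == "A").ToList();
            Console.WriteLine("2nd query, same context: {0} ms", sw.ElapsedMilliseconds);
        }

        sw.Restart();
        using (var ctx = new AppDbContext())
        {
            var third = ctx.Targets.Where(t => t.Phase == "A").ToList();
            Console.WriteLine("1st query, new context: {0} ms", sw.ElapsedMilliseconds);
        }
    }
}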

It would also be interesting to compare execution with a compiled query, but I have a feeling that compiled queries are not supported by the DbContext API.
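
For what it's worth, compiled queries (CompiledQuery.Compile) are written against an ObjectContext-derived class, so the sketch below assumes such a class (here called MyEntities, with an ObjectSet<Target> named Targets); as far as I know there is no direct equivalent on the DbContext API in 4.1.

using System;
using System.Data.Objects;
using System.Linq;

public static class Queries
{
    // Compiled once; the translated SQL is reused on every invocation.
    public static readonly Func<MyEntities, string, IQueryable<Target>> TargetsByPhase =
        CompiledQuery.Compile((MyEntities ctx, string phase) =>
            ctx.Targets.Where(t => t.Phase == phase));
}

// Usage:
// var targets = Queries.TargetsByPhase(context, "A").ToList();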


When you switched to Code First, did the database structure change?

My guess is yes, and that change is what is causing the performance difference.

I also noticed that in your class you have:

 public int TrialDefinitionId { get; set; } 

and

 public virtual TrialDefinition TrialDefinition { get; set; } 

Are both of them required?

