Lazy versus eager loading performance in Entity Framework

So, I have the following model classes in my DbContext:

Loans

Every time I create a list of LoanApplication objects, I do something like this:

    var context = new MyContext();
    var applications = context.LoanApplications.Where(d => d.PropertyThatIWantToFilter == localVariable);

This returns an IQueryable, which is then converted to a view model when the controller action is called:

 var vm = applications.Select(d => new LoanApplicationViewModel(d)); 

Inside the LoanApplicationViewModel constructor I accept an entity object and perform the appropriate mapping. The problem is that, because the Solicitors collection is a navigation property, a call is made to the database every time a new view model is created. The average number of solicitors per application is two, so if I build a table listing the last 10 applications, the application makes roughly 18-20 round trips to the database.
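The view model constructor is not shown in the question, but a minimal sketch of what it presumably looks like (the Id and Name properties are assumptions) illustrates where the extra queries come from: each access to the Solicitors navigation property lazily loads that application's solicitors in a separate round trip.

    using System.Collections.Generic;
    using System.Linq;

    public class LoanApplicationViewModel
    {
        public int Id { get; set; }
        public List<string> SolicitorNames { get; set; }

        public LoanApplicationViewModel(LoanApplication application)
        {
            Id = application.Id;
            // Touching the Solicitors navigation property here triggers one
            // lazy-load query per LoanApplication when lazy loading is enabled.
            SolicitorNames = application.Solicitors.Select(s => s.Name).ToList();
        }
    }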

I thought there had to be a better way to get this collection, so I modified my initial query to eagerly load the collection, as follows:

 var applications = context.LoanApplications.Include("Solicitors").Where... 

Although this reduced the number of database calls to one, the query itself was much slower, around 50% slower.
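As an aside, on EF 4.1 and later the same eager load can be expressed with the strongly typed Include overload from the System.Data.Entity namespace; this does not change the generated SQL or the performance, it only removes the magic string:

    using System.Data.Entity; // brings the lambda Include extension method into scope

    var applications = context.LoanApplications
        .Include(a => a.Solicitors)
        .Where(a => a.PropertyThatIWantToFilter == localVariable);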

The database is hosted on SQL Azure and we have implemented Transient Fault Handling, so I want to reduce the number of calls made to the database without compromising response times.

What is the best practice here?

+9
c# asp.net-mvc entity-framework code-first azure-sql-database
Apr 03 '13 at 3:29
5 answers

"What is the best practice here?"

Best practice is to:

  • set application-wide performance goals
  • profile, benchmark, and identify the bottleneck
  • review and fine-tune the bottleneck that gives you the most performance for the least work (and in my experience, 90% of the time it is not TSQL)

Now this may seem a little beside the point, but from that perspective, whichever loading pattern PROFILES as optimal in your application domain is the right one.

There is no eager/lazy "best practice". That is why both options are available. Also, if TSQL is your bottleneck and switching between eager and lazy loading still does not meet your performance targets, you will need to reach for other tools, such as the query analyzer and the query plan viewer in SSMS.
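A minimal sketch of how the two patterns could be benchmarked against this particular codebase. MyContext, LoanApplications and Solicitors are the names used in the question; Database.Log assumes EF 6 (on earlier versions use SQL Profiler instead); Take(10) stands in for "the last 10 applications".

    using System;
    using System.Data.Entity;
    using System.Diagnostics;
    using System.Linq;

    var sw = Stopwatch.StartNew();

    using (var context = new MyContext())
    {
        context.Database.Log = Console.WriteLine;   // print each SQL round trip
        var eager = context.LoanApplications
            .Include("Solicitors")
            .Take(10)
            .ToList();
    }
    Console.WriteLine("Eager load: {0} ms", sw.ElapsedMilliseconds);

    sw.Restart();

    using (var context = new MyContext())
    {
        context.Database.Log = Console.WriteLine;
        var lazy = context.LoanApplications.Take(10).ToList();
        foreach (var app in lazy)
        {
            var count = app.Solicitors.Count;       // one extra query per row
        }
    }
    Console.WriteLine("Lazy load: {0} ms", sw.ElapsedMilliseconds);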




For background:

I was searching for information about slow eager loading and ended up here. Here are my results:

    var foo = _context.Foos
        //.Include("Answers")
        //.Include("Attachments")
        .FirstOrDefault(q => q.Id == key);

Eager loading: 106 ms

Lazy loading: 11 ms + 5 ms + 5 ms

Lazy loading wins, end of story.

+13
Nov 11 '13 at 0:22

Beyond the SQL statements themselves, which either return huge result sets or cause many round trips depending on whether you use eager or lazy loading, there is also a huge amount of work spent materializing and tracking the entities in the ObjectContext/DbContext from the results. This leads to a big performance hit, and I can't really recommend either approach when fetching a lot of data.

The better solution is to do an explicit Select. However, it is a little difficult to give you an example without knowing how your view model is created, so what I'm doing here is giving you an example that uses an anonymous object as the result of the query.

This example fetches contacts together with information about the customer each contact belongs to.

    var contacts = context.Contacts
        .Where(row => row.CategoryId == 1)
        .Select(row => new
        {
            ContactId = row.Id,
            Name = row.Name,
            CustomerName = row.Customer.Name
        })
        .ToList();

This query generates a single SQL SELECT that joins Contacts to Customers with an inner join and selects only the Contact.Id, Contact.Name and Customer.Name columns.

This is the most efficient way to retrieve data from the server if you are not going to modify the data and save the changes back through the same context. It uses neither eager nor lazy loading.
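Applied to the question's model, the same idea might look like the sketch below. LoanApplicationViewModel is assumed here to have settable Id and SolicitorNames properties, and Solicitor is assumed to have a Name property; adjust to the real view model, the point is simply that all the shaping happens in one SQL query.

    var applicationVms = context.LoanApplications
        .Where(a => a.PropertyThatIWantToFilter == localVariable)
        .Select(a => new
        {
            a.Id,
            SolicitorNames = a.Solicitors.Select(s => s.Name)
        })
        .ToList()   // single round trip; everything below runs in memory
        .Select(x => new LoanApplicationViewModel
        {
            Id = x.Id,
            SolicitorNames = x.SolicitorNames.ToList()
        })
        .ToList();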

+4
Apr 04 '13 at 18:59

If you could query the Solicitors table separately and filter that query by the already-selected list of applications, the fetched entities would be cached in your context, and I believe the navigation property would then be served from the context rather than going back to the database.

I'm not sure exactly how to write the solicitors query, but I was thinking of something like this:

    int[] applicationIDs = applications.Select(x => x.ID).ToArray();

    // Added ToArray to force execution, because I'm never sure when the LINQ query actually runs.
    var solicitors = context.Solicitors
        .Where(x => x.Applications.Any(y => applicationIDs.Contains(y.ID)))
        .ToArray();
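For the navigation property to actually be served from the context cache rather than from another query, lazy loading must not fire when it is read, and both sides of the relationship need to be tracked. A rough sketch building on the snippet above; whether EF's relationship fix-up populates the collection depends on how the relationship is mapped.

    // Materialize the applications so they are tracked by the context
    // (the solicitors loaded above are then wired to them by relationship fix-up).
    var applicationList = applications.ToList();

    // With lazy loading off, reading application.Solicitors returns whatever
    // fix-up has already populated instead of triggering another query.
    context.Configuration.LazyLoadingEnabled = false;

    var vms = applicationList.Select(a => new LoanApplicationViewModel(a)).ToList();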
0
Apr 03 '13 at 3:44

Have you considered using a SQL view?

I'm not entirely sure about SQL Azure, but in SQL Server you can hit a performance penalty when joining two tables without the proper indexes. That may be what is happening in your query.

Note that your original query hits one table with a where clause and makes two (or more) calls, while your modified query hits two tables with a where clause in a single call. The modified query contains a join, so it probably needs a different index.

You could create a SQL view to make sure the correct index is used, and then have your application query the view. A stored procedure could also serve this purpose, but it is less suitable here.
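A rough sketch of what that could look like from the application side, assuming EF's SqlQuery is used to call the view; the view name, its columns and the join condition are all hypothetical, since the real schema is not shown.

    // Hypothetical view, created directly in the database (or in a migration):
    //
    //   CREATE VIEW dbo.vw_LoanApplicationSolicitors AS
    //   SELECT la.Id   AS LoanApplicationId,
    //          la.PropertyThatIWantToFilter,
    //          s.Id    AS SolicitorId,
    //          s.Name  AS SolicitorName
    //   FROM dbo.LoanApplications la
    //   JOIN dbo.Solicitors s ON /* join condition for your schema */

    public class LoanApplicationSolicitorRow
    {
        public int LoanApplicationId { get; set; }
        public int SolicitorId { get; set; }
        public string SolicitorName { get; set; }
    }

    var rows = context.Database
        .SqlQuery<LoanApplicationSolicitorRow>(
            "SELECT LoanApplicationId, SolicitorId, SolicitorName " +
            "FROM dbo.vw_LoanApplicationSolicitors " +
            "WHERE PropertyThatIWantToFilter = @p0",
            localVariable)
        .ToList();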

0
Apr 03 '13 at 4:39

Eager loading retrieves redundant master data. It also consumes a lot of memory: although the object graph in the context stores the master data only once per entity, the SQL result set ships a lot of duplicated data to it. I took the following image from here

[Image: a SQL result set in which the User table's columns are repeated for every matching UserDetails row]

As you can see, the User table's data is repeated for every matching UserDetails row in the result set of the SQL query. That duplication is the performance factor to watch (in your case, the master table's columns would be repeated for every record in the detail table).

If you are worried about performance, I would recommend using a LINQ join with the same where clause and fetching the data for the detail table in a separate query. So, in your case:

Step 1:

    var context = new MyContext();
    var applications = context.LoanApplications.Where(d => d.PropertyThatIWantToFilter == localVariable);

and then Step 2:

    // The join key below is an assumption; adjust it to your actual schema.
    var solicitors = from s in context.Solicitors
                     join loanApp in context.LoanApplications
                         on s.LoanApplicationId equals loanApp.Id
                     where loanApp.PropertyThatIWantToFilter == localVariable // same condition as in the step 1 where clause
                     select s; // or project only the columns you need
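To then attach the separately fetched solicitors back to their applications in memory, something like the following could work; it assumes Solicitor carries a LoanApplicationId foreign key and LoanApplication has an Id key, neither of which is shown in the question.

    // Group the detail rows by their parent key once, in memory.
    var solicitorsByApplication = solicitors
        .ToList()
        .ToLookup(s => s.LoanApplicationId);

    var result = applications
        .AsEnumerable()
        .Select(a => new
        {
            Application = a,
            Solicitors = solicitorsByApplication[a.Id].ToList()
        })
        .ToList();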

Thanks, your question made me review my own code :-)

0
Aug 08 '13 at 10:30


