Which is faster? Data search

Is it faster to make one trip to the database and return 3,000-plus rows, then manipulate them in .NET with LINQ, or to make 6 calls, each returning a couple of hundred rows at a time?

+4
source
12 answers

Is this for a single user, or will many users request data? A single database call will scale better under load.

+1
source

It will completely depend on the speed of the database, network bandwidth and latency, the speed of the .NET machine, actual queries, etc.

In other words, we cannot give you a true, general answer. I know which one sounds easier to code, though :)

Unfortunately, this is a thing that you cannot easily test without having an exact copy of the production environment. Most test environments are slightly different from the production environment, which can seriously change the results.

+6
source

Speed is just one consideration among many.

How flexible is your code? How easy is it to revise and extend when requirements change? How easy is it for another person to read and maintain your code? How portable is your code? What if you switch to another DBMS or another programming language? Are any of these considerations important in your case?

Having said that, make the single round trip if all other things are equal or inconsequential.

You mentioned that a single round trip may read data that you do not need. If all the data you need can be described in one result table, then it should be possible to write a query that produces that result. That result table may repeat some data across rows if the query denormalizes it; in that case, you may gain some speed by fetching the data in several result tables and assembling the final result yourself.

You have not provided enough information to judge how much programming effort would go into composing a single query versus combining the data returned by 6 queries.

As others have said, it depends.

+1
source

The problem I have here is that I need all of it; I just need to display it separately...

The answer to your question: 1 query for 3,000 rows is better than 6 queries for 500 rows (given that you actually use all 3,000 rows).

However, you do not want to display 3,000 rows at a time, right? In all likelihood, whether or not you use LINQ, you will want to run aggregate queries and let the database do the work. You should aim to build the SQL (or the LINQ query) so that it executes all the necessary logic in one shot.
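As a rough in-memory sketch of that one-shot idea (the row shape and column names `TypeId`/`Amount` are invented for the example): the same `GroupBy` shape, written against a LINQ-to-SQL or Entity Framework `IQueryable`, translates to a single `GROUP BY` statement executed by the database instead of aggregating 3,000 rows in .NET.

```csharp
using System;
using System.Linq;

// Invented stand-in rows; in the real case these would come from the database.
var orders = new[]
{
    new { TypeId = 1, Amount = 10m },
    new { TypeId = 1, Amount = 20m },
    new { TypeId = 2, Amount = 5m },
};

// Aggregate in the query itself rather than looping over raw rows afterwards.
var totals = orders
    .GroupBy(o => o.TypeId)
    .Select(g => new { TypeId = g.Key, Total = g.Sum(o => o.Amount) })
    .ToList();

foreach (var t in totals)
    Console.WriteLine($"{t.TypeId}: {t.Total}");
```

Against an in-memory array this runs in LINQ to Objects; against a database provider the same expression ships the aggregation to the server.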

Without knowing what you are doing, it is hard to be more specific.

* If you absolutely need to return all the rows, then look at the ToLookup() method on your LINQ IQueryable<T>. It is very convenient for grouping results in non-standard ways.
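A minimal sketch of ToLookup() (the row shape here is invented): one pass over the materialized results builds an in-memory index, and indexing into the lookup afterwards never re-queries the source.

```csharp
using System;
using System.Linq;

// Invented rows standing in for a materialized query result.
var rows = new[]
{
    new { UserId = 1, TypeId = 2, Field1 = "a" },
    new { UserId = 1, TypeId = 2, Field1 = "b" },
    new { UserId = 1, TypeId = 3, Field1 = "c" },
};

// Builds the whole index in one pass over the sequence.
var lookup = rows.ToLookup(r => r.TypeId);

Console.WriteLine(lookup[2].Count()); // 2
Console.WriteLine(lookup[3].Count()); // 1
Console.WriteLine(lookup[9].Count()); // 0 - a missing key yields an empty sequence, not an exception
```

Unlike a Dictionary, a Lookup maps each key to a sequence of values and returns an empty sequence for absent keys, which makes it pleasant for "bind each group to its own grid" scenarios.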

Oh, and I highly recommend LINQPad (free) for experimenting with LINQ queries. It ships with many examples, and it also shows the SQL and lambda forms, so you can get comfortable with the LINQ ↔ lambda ↔ SQL correspondence.

+1
source

If you know in advance which of the six SQL statements you intend to execute, you can batch them into one call to the database and return multiple result sets using ADO or ADO.NET.

http://support.microsoft.com/kb/311274
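A sketch of that batching pattern in ADO.NET, assuming classic System.Data.SqlClient; the connection string, table, and column names are hypothetical, and only two of the six statements are shown. The key call is SqlDataReader.NextResult(), which advances to the next SELECT's result set within the same round trip.

```csharp
// Sketch only: connection string, table, and column names are made up.
using System.Data.SqlClient;

using var conn = new SqlConnection("Server=.;Database=MyDb;Integrated Security=true");
conn.Open();

// One round trip, multiple result sets: batch the statements in a single command.
using var cmd = new SqlCommand(
    "SELECT field1, field2 FROM tab WHERE userid = @uid AND typeid = 1;" +
    "SELECT field1, field2 FROM tab WHERE userid = @uid AND typeid = 2;",
    conn);
cmd.Parameters.AddWithValue("@uid", 1);

using var reader = cmd.ExecuteReader();
do
{
    while (reader.Read())
    {
        // process the rows of the current result set
    }
} while (reader.NextResult()); // move on to the next SELECT's results
```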

+1
source

Well, the answer is always "it depends". Do you want to optimize database load or application load?

My general answer in this case would be to use queries as specific as possible at the database level, hence the 6 calls.

0
source

thanks

I kind of figured it was a ballpark call, but it sounds like it's a toss-up... the difference is probably small.

I thought that getting all the data and manipulating it in .NET would be best - I have nothing concrete to base that on (hence the question); I am simply inclined to believe that calls to the DB are expensive, and if I know I need all the data... get it in one go?!?

0
source

Part of the problem is that you did not provide enough information to give you an accurate answer. Obviously, the available resources must be taken into account.

If you pull 3,000 rows infrequently, this may work for you in the short term. However, if, say, 10,000 people execute the same request (ignoring caching effects), this can be a problem for both the application and the db.

Now, in a case like pagination, it makes sense to fetch exactly what you need. But as a general rule, try to pull only what is needed. It is much wiser to use a scalpel instead of a broadsword. =)

0
source

If you are talking about a query that SQL Server has already compiled and optimized, running it through LINQ or a SqlDataReader may give the same performance.

The only difference will be "how easy will your code be to maintain?"

LINQ does not query the database until you ask for the result with .ToList(), .ToArray(), or even .Count(). LINQ builds your query dynamically, so it ends up exactly the same as with SqlDataReader, but with runtime checking.
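A small in-memory sketch of that deferred execution; with a database-backed IQueryable the same principle applies, except that enumeration is the moment the SQL is actually sent to the server.

```csharp
using System;
using System.Linq;
using System.Collections.Generic;

var numbers = new List<int> { 1, 2, 3 };

// No work happens here: the query is only a description of what to do.
var evens = numbers.Where(n => n % 2 == 0);

numbers.Add(4); // mutate the source AFTER building the query

// Execution happens now, so the late addition is visible.
Console.WriteLine(evens.Count()); // prints 2 (the values 2 and 4)
```

If the query had executed at the Where() line, the count would have been 1; deferral means the data is read as of enumeration time, not definition time.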

0
source

I always adhere to the rule "fetch only what I need" and nothing more... the problem I have here is that I need all of it; I just need to display it separately.

So say... I have a table with a userid and a typeid. I want to fetch all records for a userid and display them on the page in grids, separated by typeid.

At the moment, I call a sproc that does "select field1, field2 from tab where userid = 1", then on the page I set each grid's data source with "from t in tab where t.typeid == 2 select t";

instead of calling a different sproc, "select field1, field2 from tab where userid = 1 and typeid = 2", 6 times.
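A sketch of that fetch-once approach with invented stand-in rows: one database call materializes the list, and each grid then filters the already-loaded data instead of hitting the database again.

```csharp
using System;
using System.Linq;
using System.Collections.Generic;

// Stand-in for the rows returned by the single
// "select field1, field2, typeid from tab where userid = 1" call.
var all = new List<(int TypeId, string Field1)>
{
    (1, "a"),
    (2, "b"),
    (2, "c"),
};

// One round trip, materialized once; each grid binds to an in-memory filter.
var grid1 = all.Where(r => r.TypeId == 1).ToList();
var grid2 = all.Where(r => r.TypeId == 2).ToList();

Console.WriteLine($"grid1: {grid1.Count} rows, grid2: {grid2.Count} rows");
// prints: grid1: 1 rows, grid2: 2 rows
```

Whether this beats six targeted sproc calls depends on the trade-offs discussed above (row size, user count, network latency); the sketch only shows the mechanics.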

??

0
source

Instead of thinking, why don't you try both and measure the results?

0
source

It depends:

1) If your connector implementation prefetches many objects and you have large rows (e.g. blobs, country polygons, etc.), you have a problem: you end up loading a lot of data. I once optimized code that had this problem - it was constantly downloading a few megabytes of garbage over localhost - and my software ran 10 times faster after I turned the prefetching option off.

2) If your rows are small, and there is a good chance you will need to read all 3,000, you are better off with one large result set.

3) If you do not use prepared statements, every request has to be parsed! Fewer, bigger result sets could be better.

Hope this helps

0
source
