Optimal batch size for fetching rows from a large table

I have a very large table containing about 20 million rows, and I need to fetch about 4 million of them based on some filtering criteria. All columns in the filtering criteria are covered by an index, and the table statistics are up to date.

It was suggested to me that instead of loading all the rows in one pass, I should fetch them in batches, say 80,000 rows at a time, and that this would be faster than loading everything at once.

Does this approach make sense?

If it does, what would be the optimal number of rows to load at a time?
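Batched fetching is usually done with keyset pagination: seek past the last key seen rather than using a growing OFFSET, so each batch is a cheap index range scan. Below is a minimal sketch of the idea; the table name `big_table`, the `id`/`value` columns, and the filter are hypothetical, and sqlite3 stands in for the real database only so the example is self-contained.

```python
import sqlite3

def fetch_in_batches(conn, batch_size=80_000):
    """Keyset pagination: resume each batch after the last id seen,
    instead of OFFSET, so the server never rescans skipped rows."""
    last_id = 0
    while True:
        rows = conn.execute(
            "SELECT id, value FROM big_table "
            "WHERE id > ? AND value >= 10 "   # hypothetical filter criteria
            "ORDER BY id LIMIT ?",
            (last_id, batch_size),
        ).fetchall()
        if not rows:
            break
        yield rows
        last_id = rows[-1][0]   # resume point for the next batch

# Demo on a small in-memory table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE big_table (id INTEGER PRIMARY KEY, value INTEGER)")
conn.executemany("INSERT INTO big_table VALUES (?, ?)",
                 [(i, i % 20) for i in range(1, 1001)])

total = sum(len(batch) for batch in fetch_in_batches(conn, batch_size=100))
print(total)  # 500 of the 1000 demo rows match the filter
```

With a real 20M-row table you would tune `batch_size` empirically; the pattern itself stays the same as long as the batches are ordered by an indexed key.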

2 answers
  • Yes, batching can make sense; the right size depends on your SQL setup and workload.
  • A batch of around 10 000 rows is a reasonable starting point; measure and adjust from there.

Take a look at SSIS; it is designed for moving large volumes of data and batches rows internally.

It is part of the SQL Server Business Intelligence tooling.

