This is a very good question, and there is no single answer that fits every case. I have used and seen various strategies, each with its own pros and cons.
Fetching everything up front works well for small tables and for data that is filtered client-side: once loaded, no further database requests are needed as the user moves between pages. The downsides are a high cost at the start of the interaction and heavy memory requirements. When the user typically does not scroll through all the data, it is a waste of resources. For small dictionary tables, however, it is the best solution.
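A minimal sketch of this strategy, using Python's sqlite3 with a hypothetical `color` dictionary table standing in for a real database (table and column names are mine, for illustration):

```python
import sqlite3

# Hypothetical small dictionary table; in a real app this is your database.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE color (id INTEGER PRIMARY KEY, name TEXT)")
conn.executemany("INSERT INTO color (id, name) VALUES (?, ?)",
                 enumerate(["red", "green", "blue", "cyan"], start=1))

# One query up front: the whole table lives in application memory afterwards.
all_rows = conn.execute("SELECT id, name FROM color ORDER BY id").fetchall()

PAGE_SIZE = 2

def page(rows, number):
    """Subsequent 'pages' are plain list slices -- no further SQL is sent."""
    return rows[number * PAGE_SIZE:(number + 1) * PAGE_SIZE]

first_page = page(all_rows, 0)
second_page = page(all_rows, 1)
```

Filtering works the same way: it becomes an in-memory list comprehension instead of a `WHERE` clause, which is exactly why this only scales to small data sets.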
Paging with LIMIT/OFFSET (PostgreSQL, MySQL), ROWNUM (Oracle), or an equivalent keyword minimizes the load time for the first page. The downside is that each subsequent page loads more slowly, with more work done on the database side. This strategy is best when the user usually views only the first page or two, and worst when the user pages through all the data. It works well when the set is ordered by the primary key, but it is terrible when the data set is filtered and not ordered by an index: every page then triggers the filtering (possibly a full table scan) and an in-memory sort of the complete data set!
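A sketch of LIMIT/OFFSET paging, again with sqlite3 and an illustrative `item` table of my own invention. Note that each call re-runs the query; the database still has to produce and skip the offset rows, which is where the slowdown on later pages comes from:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE item (id INTEGER PRIMARY KEY, name TEXT)")
conn.executemany("INSERT INTO item (id, name) VALUES (?, ?)",
                 [(i, f"item-{i}") for i in range(1, 101)])

PAGE_SIZE = 10

def fetch_page(conn, number):
    """Each page is a fresh query; rows before the offset are still scanned."""
    return conn.execute(
        "SELECT id, name FROM item ORDER BY id LIMIT ? OFFSET ?",
        (PAGE_SIZE, number * PAGE_SIZE),
    ).fetchall()

page_0 = fetch_page(conn, 0)   # rows 1..10
page_5 = fetch_page(conn, 5)   # rows 51..60
```

With an index on the `ORDER BY` column the skip is cheap; with an unindexed filter and sort, every page pays the full cost described above.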
Scrolling with a database cursor is the most dangerous strategy. The database opens a cursor for the query, and when the user requests the next page, the cursor advances. It is optimal when the user usually scrolls through all the data, and it is the preferred strategy for reporting. In interactive mode, however, the database connection stays blocked: no one else can use it, and the number of connections to the database is limited! It is also very difficult to implement in a web application, where you do not know whether the user has closed the browser or is still analyzing the data, so you do not know when to release the connection.
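The cursor approach can be sketched as follows. In PostgreSQL you would use a server-side cursor (DECLARE ... CURSOR / FETCH); here sqlite3's `fetchmany` on an open cursor stands in for the idea, with a made-up `reading` table. The key point is visible in the code: the cursor, and therefore the connection, stays occupied between pages.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE reading (id INTEGER PRIMARY KEY, value REAL)")
conn.executemany("INSERT INTO reading (id, value) VALUES (?, ?)",
                 [(i, i * 0.5) for i in range(1, 26)])

# The cursor stays open between pages -- this is exactly what ties up the
# connection in the interactive scenario described above.
cur = conn.execute("SELECT id, value FROM reading ORDER BY id")

page_1 = cur.fetchmany(10)   # rows 1..10
page_2 = cur.fetchmany(10)   # rows 11..20; no new query, the cursor advanced
page_3 = cur.fetchmany(10)   # only the remaining 5 rows

cur.close()                  # only now is the connection free for other work
```

In a web application there is no reliable event telling you when to run that final `close()`, which is the problem the paragraph above describes.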
Danubian sailor