First of all, think of it this way: displaying 1 million records is meaningless for any user. So consider what the user actually expects to see. Maybe a summary? Or perhaps the results broken into pages of 25, 50, or 100 records. None of these approaches requires you to hold 1M records in memory at once.
In addition, when you run a query against the SQL database using a SqlDataReader, you don't receive all the records at once. Instead, the driver sends the query to SQL Server, the server executes it, prepares the result set, and creates a forward-only cursor on the server. The driver then retrieves one record at a time, each time you call Read() on your SqlDataReader. The behavior is very similar if you use LINQ to SQL, which uses deferred execution: the result set is not transmitted in full unless (or until) you explicitly request each row.
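Here is a minimal sketch of that streaming behavior (not from the original answer; the connection string, table, and column names are placeholders for illustration):

    using System;
    using System.Data.SqlClient;

    class ReaderDemo
    {
        static void Main()
        {
            // Placeholder connection string for illustration only.
            const string connectionString = "Server=.;Database=MyDb;Integrated Security=true;";

            using (var connection = new SqlConnection(connectionString))
            using (var command = new SqlCommand("SELECT Id, Name FROM dbo.Customers", connection))
            {
                connection.Open();
                using (var reader = command.ExecuteReader())
                {
                    // Each Read() pulls the next row from the server-side
                    // forward-only cursor; the full result set is never
                    // materialized in memory at once.
                    while (reader.Read())
                    {
                        int id = reader.GetInt32(0);
                        string name = reader.GetString(1);
                        Console.WriteLine($"{id}: {name}");
                    }
                }
            }
        }
    }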
So a simple paginated query will do the trick. Or, in other cases, some kind of summary report that condenses the data from those 1 million records into one or two pages of relevant information (see the sketch below).
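As a hypothetical illustration of the summary-report idea (the dbo.Orders table and its columns are invented here), an aggregate query lets SQL Server collapse the million rows into a handful of summary rows before anything crosses the wire:

    using System;
    using System.Data.SqlClient;

    class SummaryDemo
    {
        static void Main()
        {
            // Placeholder connection string and schema for illustration only.
            const string connectionString = "Server=.;Database=MyDb;Integrated Security=true;";
            const string sql = @"
                SELECT Region, COUNT(*) AS OrderCount, SUM(Total) AS Revenue
                FROM dbo.Orders
                GROUP BY Region
                ORDER BY Revenue DESC;";

            using (var connection = new SqlConnection(connectionString))
            using (var command = new SqlCommand(sql, connection))
            {
                connection.Open();
                using (var reader = command.ExecuteReader())
                {
                    // The aggregation happens on the server; only a few
                    // summary rows are ever returned to the application.
                    while (reader.Read())
                    {
                        Console.WriteLine(
                            $"{reader.GetString(0)}: {reader.GetInt32(1)} orders, {reader.GetDecimal(2):C}");
                    }
                }
            }
        }
    }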
Of course, if you need to move back and forth between pages, then some kind of caching may make sense. But think about it again: how often does a user really want to look through a million records? Probably never.
As a final note, if you do implement pagination, make sure the paging is done by SQL Server, sending one page of data at a time, rather than reading all 1 million rows into ASP.NET and then paging through a local copy of the data, which would be very inefficient and slow. An example of a SQL Server query that performs pagination can be found in SO Question #109232.
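Since that linked question isn't reproduced here, the following is a rough sketch of server-side paging using OFFSET/FETCH, assuming SQL Server 2012 or later (older versions would use ROW_NUMBER() instead). The connection string, table, and column names are placeholders:

    using System;
    using System.Data.SqlClient;

    class PagingDemo
    {
        // Returns one page of rows; only @PageSize rows ever cross the wire.
        static void PrintPage(string connectionString, int pageNumber, int pageSize)
        {
            const string sql = @"
                SELECT Id, Name
                FROM dbo.Customers
                ORDER BY Id
                OFFSET @Offset ROWS
                FETCH NEXT @PageSize ROWS ONLY;";

            using (var connection = new SqlConnection(connectionString))
            using (var command = new SqlCommand(sql, connection))
            {
                command.Parameters.AddWithValue("@Offset", (pageNumber - 1) * pageSize);
                command.Parameters.AddWithValue("@PageSize", pageSize);

                connection.Open();
                using (var reader = command.ExecuteReader())
                {
                    while (reader.Read())
                    {
                        Console.WriteLine($"{reader.GetInt32(0)}: {reader.GetString(1)}");
                    }
                }
            }
        }

        static void Main()
        {
            // Placeholder connection string; fetch page 3 with 50 rows per page.
            PrintPage("Server=.;Database=MyDb;Integrated Security=true;", pageNumber: 3, pageSize: 50);
        }
    }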
Mike Dinescu