What would be the best way to retrieve about a million records from a database?

I need to retrieve data and display it on a web page; depending on the filters, the number of records can vary from 500 to 1 million.

Should caching be used here? I think holding a million records in memory is not a good idea. SqlDataReader?

Paging is a mandatory part of the implementation, of course. Reading 1 million records is the worst-case scenario (a silly "All" filter in the use cases!).

Should I use a connected architecture (SqlDataReader) or a disconnected architecture (DataSets)?

+6
sql-server
4 answers

First of all, think of it this way: displaying 1 million records is absolutely meaningless to any user, so think about what the user actually expects to see. Maybe a summary? Or perhaps break the data into pages of, say, 25, 50 or 100 records. None of these approaches requires you to hold 1M records in memory at a time.

In addition, when you run a query against the SQL database and use a SqlDataReader, you do not receive all the records at once. Instead, the SQL driver sends the query to the SQL server, the server executes it, prepares the result set, and creates a forward-only cursor on the server. The driver then retrieves one record at a time, each time you call Read() on your SqlDataReader. The behavior is very similar if you use LINQ to SQL, which uses deferred execution. The result set is not transmitted in full until (or unless) you specifically request each row.
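To illustrate that streaming behavior, here is a minimal C# sketch (the connection string and table name are placeholders, not from the original question); each Read() pulls one row from the server-side cursor, so the full result set never sits in memory:

    using System.Data.SqlClient;

    class ReaderDemo
    {
        static void Main()
        {
            using (var connection = new SqlConnection("Server=.;Database=MyDb;Integrated Security=true"))
            using (var command = new SqlCommand("SELECT ID, UserName FROM dbo.MyHugeTable", connection))
            {
                connection.Open();
                using (SqlDataReader reader = command.ExecuteReader())
                {
                    // Each Read() fetches the next row from the forward-only server cursor.
                    while (reader.Read())
                    {
                        int id = reader.GetInt32(0);
                        string userName = reader.GetString(1);
                        // Process one row at a time; stop as soon as you have enough.
                    }
                }
            }
        }
    }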

So a simple paging query will do the trick. Or, in other cases, some kind of summary report that condenses the data from those 1 million records into one or two pages of relevant information.

Of course, if you need to move back and forth through the pages, then some kind of caching may make sense, but think about it again: how often does a user really want to page through a million records? Probably never.

As a final note, if you do implement pagination, make sure the method you use relies on SQL Server sending the data one page at a time, rather than reading all 1 million rows into ASP.NET and then paging through a local copy of the data, because that would be very inefficient and slow. For an example of a SQL Server query that performs pagination, see SO Question #109232.
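As a rough illustration of keeping the paging on the server, here is a hedged C# sketch (table, column, and connection-string names are placeholder assumptions) that parameterizes the page boundaries so only one page of rows ever crosses the wire:

    using System.Data;
    using System.Data.SqlClient;

    class PagedQueryDemo
    {
        // Returns one page of rows; SQL Server filters on RowNum, so only pageSize rows are sent back.
        static DataTable GetPage(string connectionString, int pageIndex, int pageSize)
        {
            const string sql = @"
                SELECT ID, UserName
                FROM (
                    SELECT ID, UserName,
                           ROW_NUMBER() OVER (ORDER BY ID DESC) AS RowNum
                    FROM dbo.MyHugeTable
                ) AS Numbered
                WHERE RowNum BETWEEN (@PageIndex * @PageSize) + 1
                                 AND (@PageIndex + 1) * @PageSize;";

            using (var connection = new SqlConnection(connectionString))
            using (var command = new SqlCommand(sql, connection))
            {
                command.Parameters.Add("@PageIndex", SqlDbType.Int).Value = pageIndex;
                command.Parameters.Add("@PageSize", SqlDbType.Int).Value = pageSize;

                var page = new DataTable();
                new SqlDataAdapter(command).Fill(page);   // Fill opens and closes the connection itself.
                return page;
            }
        }
    }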

+10

I agree with the other answers: displaying 1M records is ridiculous. However, you can display the first X records and let the user page through the rest.

Do the fetch in a stored procedure:

    ALTER PROCEDURE [dbo].[MyHugeTable_GetWithPaging]
    (
        @StartRowIndex int,
        @MaximumRows int
    )
    AS
    SET NOCOUNT ON

    SELECT RowNum, [UserName]
    FROM
    (
        SELECT [ID], [UserName],
               ROW_NUMBER() OVER (ORDER BY [ID] DESC) AS RowNum
        FROM dbo.[MyHugeTable] t
    ) AS DerivedTableName
    WHERE RowNum BETWEEN @StartRowIndex AND (@StartRowIndex + @MaximumRows)
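For completeness, a minimal sketch (not from the original answer; the connection string is a placeholder) of calling that procedure from C#, so the returned page can then be bound to a grid on the web page:

    using System.Data;
    using System.Data.SqlClient;

    class PagingCall
    {
        static DataTable GetPage(string connectionString, int startRowIndex, int maximumRows)
        {
            using (var connection = new SqlConnection(connectionString))
            using (var command = new SqlCommand("dbo.MyHugeTable_GetWithPaging", connection))
            {
                command.CommandType = CommandType.StoredProcedure;
                command.Parameters.Add("@StartRowIndex", SqlDbType.Int).Value = startRowIndex;
                command.Parameters.Add("@MaximumRows", SqlDbType.Int).Value = maximumRows;

                var page = new DataTable();
                new SqlDataAdapter(command).Fill(page);   // Only the requested page comes back.
                return page;                              // Bind this to a GridView, repeater, etc.
            }
        }
    }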
+3

If your server cannot cache 1 million records, how do you expect your user's web browser to handle 1 million records rendered as HTML?

Consider paging (here is an example with 1 million records).

Also consider that a user rarely wants more than 30 to 50 records at once. Beyond that, you are either showing too low a level of detail, or more filtering is needed.

+1

I suggest using a dynamic query with paging, so that when the user clicks on a particular page, only the records for that page are selected. To retrieve records in a specific range from the database, you can use a query like this:

    CREATE PROC Test
        @take smallint,
        @skip smallint,
        @orderBy nvarchar(20),
        @subscriptionid smallint
    AS
    BEGIN
        DECLARE @SQLQuery AS NVARCHAR(max)

        SET @SQLQuery = 'SELECT ROW_NUMBER() OVER (ORDER BY P.ProductId DESC) AS RowNum, *
                         FROM Product P WHERE 1 = 1'
        SET @SQLQuery = @SQLQuery + ' AND Subscriptionid = ' + CONVERT(nvarchar, @subscriptionid)
        SET @SQLQuery = ';WITH Results_CTE AS ( ' + @SQLQuery
        SET @SQLQuery = @SQLQuery + ' )
            SELECT * FROM Results_CTE
            WHERE RowNum > ' + CONVERT(nvarchar, @skip) + '
              AND RowNum <= ' + CONVERT(nvarchar, @skip + @take)   -- paging

        EXECUTE sp_executesql @SQLQuery
    END
0
