Using a DataSet when the data source is very large?

I have read several MS articles on when to use DataSets with a database from a WinForms application. I definitely like the ease of use of DataSets, but there are a few problems with using them against a large data source. I want to use a SQLite database to store processed weblog information locally, and this could potentially grow to tens of thousands of rows of data.

When a DataSet is populated from a database table, does it end up containing ALL of the data in that table, or only part of it?
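For example, if I fill a table from a limited query like this (a rough sketch using the System.Data.SQLite provider and a made-up LogEntries table), does the DataTable end up holding only the rows the SELECT returns, or does it pull in the whole table behind the scenes?

```
using System;
using System.Data;
using System.Data.SQLite;

class FillExample
{
    static void Main()
    {
        using (var conn = new SQLiteConnection("Data Source=weblog.db"))
        using (var adapter = new SQLiteDataAdapter(
            "SELECT Id, Url, HitCount FROM LogEntries LIMIT 500", conn))
        {
            var table = new DataTable("LogEntries");
            adapter.Fill(table);

            // Is this at most 500 rows in memory, or something bigger?
            Console.WriteLine(table.Rows.Count);
        }
    }
}
```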

Can I use a DataSet to add rows to the database in stages? For example, add some rows, perform an update, somehow "clear" what the DataSet is holding in memory, and then add more rows?
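Concretely, is a cycle like the following workable? (Again just a sketch against the made-up LogEntries table; I'm assuming a SQLiteCommandBuilder can generate the INSERT for the adapter.)

```
using System.Data;
using System.Data.SQLite;

class BatchInsertExample
{
    static void Main()
    {
        using (var conn = new SQLiteConnection("Data Source=weblog.db"))
        using (var adapter = new SQLiteDataAdapter(
            "SELECT Url, HitCount FROM LogEntries WHERE 1 = 0", conn)) // schema only, no rows
        using (var builder = new SQLiteCommandBuilder(adapter))        // supplies the INSERT command
        {
            var table = new DataTable("LogEntries");
            adapter.Fill(table); // empty table with the right columns

            for (int batch = 0; batch < 10; batch++)
            {
                for (int i = 0; i < 1000; i++)
                    table.Rows.Add($"/page/{batch}/{i}", i); // queue up new rows

                adapter.Update(table); // push this batch to SQLite
                table.Clear();         // drop the in-memory copy before the next batch
            }
        }
    }
}
```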

So, is it possible to meaningfully control what the DataSet is currently holding in memory? If a DataSet represents a table containing 100,000 rows, does that mean all 100,000 rows have to be loaded from the database into memory before it becomes usable?

Thanks.

1 answer

You raise some very important points here. They go back to the early days of .NET, when we suddenly moved to the disconnected model that .NET introduced.

Don't load everything at once; page through the data instead. Pull in, say, 200 rows. When the user asks for row â„– 201, fetch the next 200 rows and drop (or keep) the first 200, perhaps capping what you hold at something like 1000 rows. That way only a small window of the data ever sits in memory.
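A rough sketch of that windowing idea, assuming the System.Data.SQLite provider and a hypothetical LogEntries(Id, Url, HitCount) table; the 200-row page size is just the figure used above:

```
using System;
using System.Data;
using System.Data.SQLite;

class PageThroughLog
{
    const int PageSize = 200;

    static DataTable LoadPage(SQLiteConnection conn, int pageIndex)
    {
        const string sql =
            "SELECT Id, Url, HitCount FROM LogEntries " +
            "ORDER BY Id LIMIT @limit OFFSET @offset";

        using (var cmd = new SQLiteCommand(sql, conn))
        using (var adapter = new SQLiteDataAdapter(cmd))
        {
            cmd.Parameters.AddWithValue("@limit", PageSize);
            cmd.Parameters.AddWithValue("@offset", pageIndex * PageSize);

            var page = new DataTable("LogEntries");
            adapter.Fill(page); // only this one page is materialised
            return page;
        }
    }

    static void Main()
    {
        using (var conn = new SQLiteConnection("Data Source=weblog.db"))
        {
            DataTable current = LoadPage(conn, 0); // rows 1..200
            Console.WriteLine(current.Rows.Count);

            current = LoadPage(conn, 1);           // user reaches row 201: swap pages
            Console.WriteLine(current.Rows.Count); // the previous 200 rows can now be collected
        }
    }
}
```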

Remember that a DataSet/DataTable is just an in-memory container. It holds only what you load into it, i.e. the DataTables contain exactly the rows you have filled or added, nothing more.
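To make that concrete, a DataTable built by hand never touches a database at all and holds exactly the rows put into it:

```
using System;
using System.Data;

class InMemoryOnly
{
    static void Main()
    {
        var table = new DataTable("Hits");
        table.Columns.Add("Url", typeof(string));
        table.Columns.Add("Count", typeof(int));

        table.Rows.Add("/index.html", 42);
        table.Rows.Add("/about.html", 7);

        Console.WriteLine(table.Rows.Count); // 2 -- nothing more, nothing less
    }
}
```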

Also take a look at Entity Framework. It can take care of a lot of this for you.
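For illustration only, here is roughly what that could look like with EF Core and its Microsoft.EntityFrameworkCore.Sqlite provider; the LogEntry entity and WeblogContext are invented for the example:

```
using System.Linq;
using Microsoft.EntityFrameworkCore;

public class LogEntry
{
    public long Id { get; set; }
    public string Url { get; set; }
    public int HitCount { get; set; }
}

public class WeblogContext : DbContext
{
    public DbSet<LogEntry> LogEntries { get; set; }

    protected override void OnConfiguring(DbContextOptionsBuilder options)
        => options.UseSqlite("Data Source=weblog.db");
}

class EfPaging
{
    static void Main()
    {
        using (var db = new WeblogContext())
        {
            var page = db.LogEntries
                         .OrderBy(e => e.Id)
                         .Skip(200)   // becomes LIMIT/OFFSET in the generated SQL
                         .Take(200)
                         .ToList();   // only these 200 entities end up in memory
        }
    }
}
```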

Or go with plain ADO.NET directly. If you are dealing with something like 100K rows, you don't want all of them in memory at once anyway; read and process them as you go.
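A sketch of that route, again assuming System.Data.SQLite and the hypothetical LogEntries table; the data reader streams one row at a time, so nowhere near 100K rows ever sit in memory:

```
using System;
using System.Data.SQLite;

class StreamWithReader
{
    static void Main()
    {
        using (var conn = new SQLiteConnection("Data Source=weblog.db"))
        using (var cmd = new SQLiteCommand("SELECT Url, HitCount FROM LogEntries", conn))
        {
            conn.Open();
            using (var reader = cmd.ExecuteReader())
            {
                while (reader.Read())
                {
                    // process one row, then let it go
                    var url = reader.GetString(0);
                    var hits = reader.GetInt32(1);
                    Console.WriteLine($"{url}: {hits}");
                }
            }
        }
    }
}
```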

Whichever way you go, the point is to keep only the rows you are actually working with in memory at any one time.

