I am using SQL Server, and I have a table that can contain between ~1 million and ~10 million records at most.
For each record I retrieve, I run a few simple checks, then mark the record as checked by issuing an UPDATE that sets its "last_checked_time" field to DateTime.Now, and move on to the next record. Later I can fetch all the records sorted by last_checked_time (ascending), so I iterate over them in order of when they were last checked.
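To make the workflow concrete, here is a rough sketch of the statements involved, assuming a hypothetical table named `my_records` with an `id` primary key (all names here are illustrative, not my actual schema):

```sql
-- Stamp the record that was just checked
-- (@now and @id would be passed as parameters from the ASP.NET code)
UPDATE my_records
SET last_checked_time = @now
WHERE id = @id;

-- Fetch records least-recently-checked first
SELECT TOP (100) id
FROM my_records
ORDER BY last_checked_time ASC;

-- An index on last_checked_time should let the ORDER BY
-- avoid sorting the whole table on every read
CREATE NONCLUSTERED INDEX IX_my_records_last_checked_time
ON my_records (last_checked_time);
```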
Is this a good practice? Can it still stay fast as long as I have no more than 10 million entries in this table?
I read somewhere that every UPDATE actually deletes the row and inserts a new one.
I would also like to mention that these records will often be read from my ASP.NET website.
I was also considering writing "last_checked_time" to a local text or binary file instead, but I guess that would just mean reimplementing what the database already does for you.