Result of a long DB query while rows are being written concurrently

I am curious about the following scenario: we have a large collection/table with a lot of data, and a long, select-oriented query against it that takes 3 seconds to run.

However, this is a highly concurrent environment: every second, 100 new rows are inserted into the database.

So suppose we run the query, and at the moment it starts there are 1,000 rows that satisfy it. The query takes 3 seconds, and during each of those seconds 50 new rows that also match the query are added to the database. My question is: what does the query return to me (the original 1,000, all 1,150, or something in between), and how does this depend on the database engine (SQL vs. NoSQL)?

I am not asking so much for the exact number as for the reasoning: why will it be that number?


The question is admittedly a bit broad, so let's limit the databases to MySQL, Postgres, MongoDB, and Cassandra.

1 answer

Generally speaking (mainly because you did not pin the question to one specific database), a database's concurrency behavior is configurable and falls under the category of performance tuning.
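
For example, engines with SQL transactions let you choose an isolation level per transaction, which controls exactly which concurrent writes a query is allowed to see. A minimal sketch in Postgres syntax; the `events` table and `category` column are hypothetical stand-ins for your data:

```sql
-- Minimal sketch (Postgres syntax, hypothetical "events" table).
-- The isolation level is set per transaction and controls which
-- concurrent writes a query is allowed to see.
BEGIN;
SET TRANSACTION ISOLATION LEVEL REPEATABLE READ;

-- Under REPEATABLE READ the query runs against a snapshot taken when
-- the transaction first reads data, so rows inserted by other sessions
-- during its 3-second run are not visible to it.
SELECT count(*) FROM events WHERE category = 'matching';

COMMIT;
```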

Some common lock granularities (a sketch contrasting two of them follows the list):

  • ROW - only a single row of data is locked at a time.
  • PAGE - a group of rows stored together (a page) is locked at a time.
  • TABLE - the entire table is locked.
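
As a rough illustration of the difference, here is a sketch in Postgres syntax (again with the hypothetical `events` table); other engines expose the same idea through different statements:

```sql
-- TABLE-level: take an explicit lock that blocks concurrent writers
-- for as long as the transaction holds it.
BEGIN;
LOCK TABLE events IN EXCLUSIVE MODE;  -- readers ok, writers blocked
SELECT count(*) FROM events WHERE category = 'matching';
COMMIT;                               -- writers may proceed again

-- ROW-level: lock only the rows the query actually touches; inserts
-- of new rows by other sessions can proceed in parallel.
BEGIN;
SELECT * FROM events WHERE category = 'matching' FOR UPDATE;
COMMIT;
```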

So, with ROW-level locking you are likely to get all 1,150 results, at the cost of higher locking overhead. With TABLE-level locking you would get exactly 1,000 results, and likely faster, but at the cost of your data stream being blocked from writing to the db for the 3 seconds the query holds the lock.
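
If you want to see which number a particular engine actually returns, you can run the experiment yourself in two sessions. A minimal sketch in Postgres syntax (hypothetical `events` table; `pg_sleep(3)` artificially stretches the statement to about 3 seconds):

```sql
-- Session 1: the "long" query. pg_sleep(3) keeps the statement running
-- for ~3 seconds so session 2 has time to write concurrently.
SELECT count(*) AS matching, pg_sleep(3)
FROM events
WHERE category = 'matching';

-- Session 2, started while session 1 is still running:
INSERT INTO events (category)
SELECT 'matching' FROM generate_series(1, 150);
```

Which count session 1 reports depends on exactly the isolation and locking behavior described above; in Postgres, for instance, a single statement always reads from a snapshot taken when it starts, so writes that commit mid-statement are not visible to it.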

