Which database deals with very large result sets?

I am currently working on a PHP application (preview release).

Background

We have a table in our MySQL database, which is expected to grow extremely large - for one user it will not be an unusual owner of 250,000 rows in this table. Each row of the table indicates the quantity and date, among other things.

In addition, this particular table is read (and written) very often - on most pages. Given that each row has a date, I use GROUP BY dateto minimize the size of the result set given by MySQL. The lines contained in the same year can now be considered as just one.

However, a typical page will still have a result-result between 1000-3000 results. There are also places where many SUM(), many dozens — if not hundreds — of thousands of lines are executed .

MySQL attempt

On a regular page, MySQL usually took about 600-900 ms. Usage LIMITand bias did not help performance, and the data was very normalized, and therefore it does not seem that further normalization will help.

Worse, there are parts of the application that need to extract 10,000-15,000 rows from the database. The results are then used in calculating PHP and formatted accordingly. Given this, MySQL performance was not acceptable.

Attempt MongoDB

MongoDB, - 250 2000 . $group , , , . , , , //, , , , , .

Redis, , , Redis.

, . , - .

:

  • , / , , ?
  • ? .

, . , - , - - /.

+4
1

, (GROUP BY), , ?

, , , .

MySQL MongoDB . , , povide, (), /, .

, MySQL, MongoDB, , , , MongoDB ().

LIMIT a OFFSET , , . MySQL , .

, MySQL ( IO), MongoDB , $out, 16 ( ).

, :

, , .., - -, " ". MongoDB: http://docs.mongodb.org/ecosystem/use-cases/pre-aggregated-reports/

, - , , , , .

+2

All Articles