Efficient way to handle large amounts of data in MySQL

I have a scenario and two options for implementing it. Which one will be more efficient?

I use MySQL to store student attendance (about 100,000 students). Later, I use the attendance data to build charts and reports based on user choices.

Approach 1) Store each student's attendance for each day in a new row (which will increase the number of rows exponentially but reduce processing time).

Approach 2) Store a serialized or JSON-formatted string of one year's attendance per student in a single row (which will increase processing time when updating attendance every day but reduce the size of the database).
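To make the two options concrete, here is a minimal sketch of both schemas. It uses SQLite from Python's standard library as a stand-in for MySQL, and all table and column names (`attendance`, `attendance_json`, `student_id`, `day`, `present`) are illustrative assumptions, not anything from the question.

```python
import json
import sqlite3

conn = sqlite3.connect(":memory:")

# Approach 1: one row per student per day.
conn.execute("""
    CREATE TABLE attendance (
        student_id INTEGER NOT NULL,
        day        TEXT    NOT NULL,  -- DATE in MySQL
        present    INTEGER NOT NULL,  -- TINYINT(1) in MySQL
        PRIMARY KEY (student_id, day)
    )
""")

# The daily update is a single cheap INSERT per student.
conn.executemany(
    "INSERT INTO attendance VALUES (?, ?, ?)",
    [(1, "2023-09-01", 1), (1, "2023-09-02", 0), (2, "2023-09-01", 1)],
)

# Chart-style aggregation stays a plain SQL query.
days_present = conn.execute(
    "SELECT COUNT(*) FROM attendance WHERE student_id = ? AND present = 1",
    (1,),
).fetchone()[0]

# Approach 2: one row per student holding a JSON blob for the whole year.
conn.execute("""
    CREATE TABLE attendance_json (
        student_id INTEGER PRIMARY KEY,
        year_data  TEXT NOT NULL
    )
""")
conn.execute(
    "INSERT INTO attendance_json VALUES (?, ?)",
    (1, json.dumps({"2023-09-01": 1})),
)

# The daily update now means read-modify-write of the whole blob,
# and aggregation needs application-side JSON parsing.
blob = conn.execute(
    "SELECT year_data FROM attendance_json WHERE student_id = ?", (1,)
).fetchone()[0]
year = json.loads(blob)
year["2023-09-02"] = 0
conn.execute(
    "UPDATE attendance_json SET year_data = ? WHERE student_id = ?",
    (json.dumps(year), 1),
)
```

The sketch already shows the trade-off the question describes: approach 1 makes both the daily write and the reporting query one-line SQL, while approach 2 pushes the work into the application.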

1 answer

First, I think you're confused: the number of rows will increase linearly, not exponentially, which is a big difference.

Second, 100k rows per day is nothing for a database. Even if you store 365 days, that's only about 36 million rows; I generate that many in a week.

Third, storing the data as JSON may complicate future queries.

Therefore, I suggest going with approach 1.

With proper indexing, schema design, and a fast hard drive, the DB can handle a billion records.
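The "proper indexing" point can be demonstrated: a composite index on (student, day) lets the engine answer the per-student date-range queries the charts need without scanning the whole table. A minimal sketch, again using SQLite as a stand-in for MySQL, with an assumed index name `idx_student_day`:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE attendance (student_id INTEGER, day TEXT, present INTEGER)"
)
# Composite index: leading column matches the equality filter,
# second column matches the range filter.
conn.execute("CREATE INDEX idx_student_day ON attendance (student_id, day)")

# Ask the planner how it would run a typical chart query.
plan = conn.execute(
    "EXPLAIN QUERY PLAN "
    "SELECT day, present FROM attendance "
    "WHERE student_id = ? AND day BETWEEN ? AND ?",
    (1, "2023-01-01", "2023-12-31"),
).fetchall()
# The plan detail should mention idx_student_day rather than a table scan.
```

In MySQL the equivalent check is `EXPLAIN SELECT ...`, and the index would typically be part of the `CREATE TABLE` statement (or even the primary key, as in the schema above).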

You can also move historical data into a separate schema so that queries against current data are a little faster, but this is just a minor tune-up.
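Moving historical rows out of the hot table can be as simple as an `INSERT ... SELECT` followed by a `DELETE`. A hedged sketch of that idea (table names and the cutoff date are assumptions, SQLite standing in for MySQL):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
for name in ("attendance", "attendance_history"):
    conn.execute(
        f"CREATE TABLE {name} (student_id INTEGER, day TEXT, present INTEGER)"
    )

conn.executemany(
    "INSERT INTO attendance VALUES (?, ?, ?)",
    [(1, "2022-06-01", 1), (1, "2023-09-01", 1)],
)

# Copy rows older than the cutoff into the history table, then delete them
# from the current table. In MySQL you would wrap this in a transaction.
cutoff = "2023-01-01"
conn.execute(
    "INSERT INTO attendance_history "
    "SELECT * FROM attendance WHERE day < ?",
    (cutoff,),
)
conn.execute("DELETE FROM attendance WHERE day < ?", (cutoff,))

current = conn.execute("SELECT COUNT(*) FROM attendance").fetchone()[0]
archived = conn.execute("SELECT COUNT(*) FROM attendance_history").fetchone()[0]
```

MySQL also offers table partitioning (e.g. by year), which achieves a similar effect without a second table, but the copy-and-delete approach above is the simplest version of the idea.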
