MySQL - Creating rows vs. columns performance

I built an analytics engine that pulls 50-100 rows of raw data from my database (let's call it raw_table), runs a bunch of statistical measurements on it in PHP, and then comes up with exactly 140 datapoints that I then need to store in another table (let's call it results_table). All of these datapoints are very small values ("40", "2.23", "-1024" are good examples of the kinds of data).

I know that the maximum number of columns for MySQL is quite high (4000+), but there appears to be a lot of grey area as to when performance really starts to degrade.

So, a few questions here about best practices:

1) The 140 datapoints could, if it's better, be broken up into 20 rows of 7 datapoints each, all with the same experiment_id, if fewer columns is preferable. HOWEVER, I would always need to pull ALL 20 rows (with 7 columns each, plus id, etc.), so I wouldn't think this would perform better than pulling 1 row of 140 columns. So the question: is it better to store 20 rows of 7-9 columns (which would all need to be pulled at once) or 1 row of 140-143 columns? (Both layouts are sketched below, after question 3.)

2) Given my sample data ("40", "2.23", "-1024" are good examples of what will be stored), I'm thinking SMALLINT for the column type (the sketch below also notes the type issue). Any feedback there, performance-wise or otherwise?

3) Any other feedback on MySQL performance issues or tips is welcome.
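
For concreteness, here is a rough sketch of the two layouts from question 1, with the type question from question 2 noted in the comments. All table and column names are invented for illustration; they are not from the actual schema.

    -- Option A: one wide row per experiment, 140 value columns
    CREATE TABLE results_wide (
        experiment_id INT NOT NULL PRIMARY KEY,
        dp_1   SMALLINT     NOT NULL,  -- 2 bytes, -32768..32767: fits "40" and "-1024"
        dp_2   DECIMAL(6,2) NOT NULL,  -- "2.23" is not an integer; fractional
                                       -- datapoints need DECIMAL or a scaled int
        -- ... dp_3 through dp_139 omitted for brevity ...
        dp_140 SMALLINT     NOT NULL
    ) ENGINE=InnoDB;

    -- Option B: 20 narrow rows per experiment, 7 value columns each
    CREATE TABLE results_narrow (
        experiment_id INT      NOT NULL,
        row_no        TINYINT  NOT NULL,  -- 1..20
        dp_1          SMALLINT NOT NULL,
        -- ... dp_2 through dp_6 omitted for brevity ...
        dp_7          SMALLINT NOT NULL,
        PRIMARY KEY (experiment_id, row_no)
    ) ENGINE=InnoDB;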

Thanks in advance for your input.

+5
3 answers

I think the advantage of storing as more rows (i.e. normalized) depends on design and maintenance considerations in the face of change.

Also, think about whether the 140 columns mean the same thing in every experiment, or whether that differs per experiment - and model the data properly according to normalization rules, i.e. by how the data relates to a candidate key.

As far as performance goes, if all the columns are always used anyway, it makes very little difference. A pivot/unpivot operation can sometimes be expensive over a large amount of data, but it matters little for a single-key access pattern. And sometimes a pivot in the database can make your front-end code a lot simpler and your back-end code more flexible in the face of change.
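
As a rough sketch of such a pivot in MySQL, assuming a hypothetical normalized table datapoints(experiment_id, datapoint_id, value) with one row per datapoint, conditional aggregation turns rows back into columns:

    -- Rebuild one wide row per experiment from narrow rows:
    SELECT experiment_id,
           MAX(CASE WHEN datapoint_id = 1 THEN value END) AS dp_1,
           MAX(CASE WHEN datapoint_id = 2 THEN value END) AS dp_2,
           -- ... one line per datapoint ...
           MAX(CASE WHEN datapoint_id = 140 THEN value END) AS dp_140
    FROM datapoints
    GROUP BY experiment_id;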

If you have a lot of NULLs, it might be possible to eliminate rows in a normalized design, and that would save space. I don't know whether MySQL has any support for a sparse-table concept, which could come into play there.
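
To illustrate the NULL point with the same hypothetical datapoints table: in a one-row-per-datapoint design, a missing measurement is simply an absent row, not a stored NULL, so sparse data costs nothing.

    -- Only datapoints that actually have a value get a row:
    INSERT INTO datapoints (experiment_id, datapoint_id, value)
    VALUES (12345, 1, 40), (12345, 2, 2.23);
    -- datapoint 3 had no value for this experiment: no row is stored at all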

+4

If the 140 datapoints are always written and read together, they are effectively a single unit of data, however you slice them.

Whether you store them as 1x140, 20x7, 7x20, 4x35 etc. makes little real difference, since all the actual analysis happens in PHP anyway.
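
For example (hypothetical names, matching the 20x7 layout sketched under the question), pulling a whole experiment is a single indexed lookup either way:

    -- All 20 rows come back in one round trip:
    SELECT dp_1, dp_2, dp_3, dp_4, dp_5, dp_6, dp_7
    FROM results_narrow
    WHERE experiment_id = 12345
    ORDER BY row_no;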

So the real question is: will you ever need to query individual datapoints in SQL?

+3

Since you are asking about performance, I assume you will be storing a lot of experiments - say, on the order of 1 billion (10^9) datapoints.

Whether that is 140 columns in one row or 7 columns spread across 20 rows, the amount of data per experiment is about the same, so the total storage volume is, too.

What matters is whether the working set fits in the innodb_buffer_pool; if it does, reads are fast, and if it does not, you are bound by disk I/O.
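
One way to sanity-check that fit (the schema and table names below are placeholders):

    -- Approximate data + index footprint of the results table:
    SELECT table_name,
           ROUND((data_length + index_length) / 1024 / 1024) AS size_mb
    FROM information_schema.TABLES
    WHERE table_schema = 'mydb'
      AND table_name   = 'results_table';

    -- Compare against the configured buffer pool:
    SHOW VARIABLES LIKE 'innodb_buffer_pool_size';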

If each datapoint is meaningful on its own, consider a fully normalized design with one row per datapoint - (experiment_id, datapoint_id, value) - which is the cleanest model.
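
That layout might be sketched like this (names invented; the DECIMAL precision is a guess):

    CREATE TABLE datapoints (
        experiment_id INT          NOT NULL,
        datapoint_id  SMALLINT     NOT NULL,  -- 1..140
        value         DECIMAL(8,2) NOT NULL,
        PRIMARY KEY (experiment_id, datapoint_id)
    ) ENGINE=InnoDB;

Since InnoDB clusters rows by primary key, all 140 rows for one experiment sit physically together, so reading an experiment is still essentially one seek.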

However, row size matters much less than the number of I/O operations. If we assume that your 1 billion datapoints do not fit in RAM (which, these days, is not a safe assumption), the resulting performance will probably be roughly the same either way.

The multi-row, normalized design is probably the better database design; but the many-column design will use less disk space and may be faster to populate.

+3