I built an analytics routine that pulls 50-100 rows of raw data from my database (let's call it raw_table), runs the statistical measurements of the beam on them in PHP, and produces exactly 140 data points that I then need to store in another table (let's call it results_table). All of these data points are small numeric values ("40", "2.23", "-1024" are representative examples).
I know that the maximum number of columns per table in MySQL is quite large (4000+), but there seems to be a lot of gray area, since performance reportedly starts to degrade well before that limit.
So, a few questions here about best practices:
1) The 140 data points could, if it's better, be split into 20 rows of 7 data points each, all sharing the same experiment_id, to keep the column count down. HOWEVER, I will always need to pull ALL 20 rows (7 columns each, plus id, etc.) at once, so I don't see how that beats pulling 1 row of 140 columns. So the question is: is it better to store 20 rows of 7-9 columns (which would all need to be fetched together) or 1 row of 140-143 columns? (Both layouts are sketched after this list.)
2) Given my sample data ("40", "2.23", "-1024" are good examples of what gets saved), I'm thinking of the SMALLINT type. Any feedback there, on performance or otherwise?
3) Any other feedback on MySQL performance issues or tips is welcome.
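To make questions 1 and 2 concrete, here is a minimal sketch of the two layouts I'm weighing. The table names, column names, and the DECIMAL(8,2) precision are placeholders, not my actual schema; and since SMALLINT is an integer type, a fractional value like 2.23 would need DECIMAL or FLOAT instead:

    -- Option A: one wide row per experiment, 140 value columns.
    CREATE TABLE results_wide (
        experiment_id INT UNSIGNED NOT NULL PRIMARY KEY,
        v001 DECIMAL(8,2) NOT NULL,  -- DECIMAL(8,2) is a guess; SMALLINT
        v002 DECIMAL(8,2) NOT NULL,  -- would only fit integer-only columns
        -- ... v003 through v139 elided ...
        v140 DECIMAL(8,2) NOT NULL
    );

    -- Option B: 20 narrow rows per experiment, 7 value columns each.
    CREATE TABLE results_narrow (
        experiment_id INT UNSIGNED NOT NULL,
        chunk         TINYINT UNSIGNED NOT NULL,  -- 1..20
        v1 DECIMAL(8,2) NOT NULL,
        v2 DECIMAL(8,2) NOT NULL,
        v3 DECIMAL(8,2) NOT NULL,
        v4 DECIMAL(8,2) NOT NULL,
        v5 DECIMAL(8,2) NOT NULL,
        v6 DECIMAL(8,2) NOT NULL,
        v7 DECIMAL(8,2) NOT NULL,
        PRIMARY KEY (experiment_id, chunk)
    );

    -- Option B would always be read back in one shot, e.g.:
    -- SELECT * FROM results_narrow WHERE experiment_id = ? ORDER BY chunk;

In both cases the full 140 points for an experiment are fetched together; the difference is only in how MySQL lays them out.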
Thank you in advance for your input.