I am creating a simple SQLite database to store sensor readings. The tables will look something like this:
sensors
- id (pk)
- name
- description
- units

sensor_readings
- id (pk)
- sensor_id (fk to sensors)
- value (actual sensor value stored here)
- time (date/time the sensor sample was taken)
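For concreteness, here is a minimal sketch of that schema in Python's sqlite3 module; the column types are my assumptions, not fixed requirements:

```python
import sqlite3

# Illustrative DDL for the two tables described above.
# Types (REAL for value, TEXT for time) are assumptions.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE sensors (
    id          INTEGER PRIMARY KEY,
    name        TEXT,
    description TEXT,
    units       TEXT
);
CREATE TABLE sensor_readings (
    id        INTEGER PRIMARY KEY,
    sensor_id INTEGER REFERENCES sensors(id),
    value     REAL,  -- actual sensor value stored here
    time      TEXT   -- date/time the sample was taken (ISO-8601)
);
""")
```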
The application will collect about 100,000 sensor readings per month from about 30 different sensors, and I would like to keep all of the readings in the database for as long as possible.
Most requests will be in the form
SELECT * FROM sensor_readings WHERE sensor_id = x AND time > y AND time < z
This query usually returns about 100-1000 results.
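To make the access pattern concrete, this is roughly how the application issues that query (a sketch with made-up sample data; ISO-8601 time strings are an assumption):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE sensor_readings (
    id INTEGER PRIMARY KEY, sensor_id INTEGER, value REAL, time TEXT);
INSERT INTO sensor_readings (sensor_id, value, time) VALUES
    (1, 20.5, '2012-01-01T00:00:00'),
    (1, 21.0, '2012-01-01T01:00:00'),
    (2,  5.0, '2012-01-01T00:30:00');
""")

# The typical range query: one sensor, bounded time window.
rows = conn.execute(
    "SELECT * FROM sensor_readings WHERE sensor_id = ? AND time > ? AND time < ?",
    (1, "2011-12-31T00:00:00", "2012-01-02T00:00:00"),
).fetchall()
```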
So the question is: how large can the sensor_readings table get before the above query becomes too slow (more than a couple of seconds on a standard PC)?
I know one fix could be to create a separate sensor_readings table for each sensor, but I would like to avoid that unless it is really necessary. Are there other ways to optimize this database schema?
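For reference, the per-sensor split I mentioned would look roughly like this (a sketch; the table names are hypothetical):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
# One readings table per sensor -- the layout I would like to avoid,
# since every new sensor would require a new table.
for sensor_id in (1, 2, 3):
    conn.execute(
        f"""CREATE TABLE sensor_{sensor_id}_readings (
                id INTEGER PRIMARY KEY, value REAL, time TEXT)"""
    )
```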
sqlite
bengineerd