How to efficiently store a large number of rows in a database

What is the best way to store large amounts of data in a database? I need to store values from various environmental sensors, each with a timestamp. I ran some tests with SQL CE: it works fine for a few hundred thousand rows, but once the table grows into the millions, queries become terribly slow. My current tables:

Datapoint:  [DatastreamID:int, Timestamp:datetime, Value:float]
Datastream: [ID:int {unique index}, Unit:nvarchar, Tag:nvarchar]

If I query Datapoints for a specific Datastream and date range, it takes a long time, especially on the embedded Windows CE device, and that is the main problem: on my development machine the query takes ~1 second, but on the CE device it takes ~5 minutes.
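The query looks roughly like this (the parameter names are just illustrative):

    -- Fetch all datapoints of one stream within a date range
    SELECT Timestamp, Value
    FROM   Datapoint
    WHERE  DatastreamID = @StreamID
      AND  Timestamp BETWEEN @From AND @To
    ORDER BY Timestamp;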

Every 5 minutes I record readings from 20 sensors: 12 readings per hour * 24 hours * 365 days = 105,120 rows per sensor per year, * 20 sensors = 2,102,400 rows per year.

And there may be even more sensors later!

I thought about some kind of web server, but the device will not always have an internet/server connection.

Data should be displayed on the device itself.

How can I speed this up? Should I choose another table layout, or use a different database (SQLite)? I am currently using .NET CF 2.0 and SQL CE 3.5.

Any tips?

+6
language-agnostic rdbms sql-server-ce
4 answers

I am sure that any relational database will meet your needs: SQL Server, Oracle, and so on. What matters is creating good indexes so that your queries are efficient. If the engine has to scan the whole table just to find a single record, it will be slow no matter which database you use.

If you always query by a specific DataStreamID value and a Timestamp range, create an index on those two columns. That way the query performs an index seek instead of a scan.
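For example, something like this (the index name is arbitrary, and the exact syntax may differ slightly between SQL CE and full SQL Server):

    -- Composite index matching the typical filter:
    -- equality on DatastreamID, range on Timestamp
    CREATE INDEX IX_Datapoint_Stream_Time
        ON Datapoint (DatastreamID, Timestamp);

Column order matters here: the equality column (DatastreamID) comes first, so the range condition on Timestamp is evaluated within a single contiguous slice of the index.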

+2

The key to quick access is using one or more indexes.

A database growing by two million rows per year is very manageable.

Adding indexes will slow down INSERTs somewhat, but your data does not arrive very fast, so this should not be a problem. You would need far more data arriving at a far higher rate than you have now before insert overhead became a concern.
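To put that in perspective, your write load is just one small INSERT per sensor every five minutes, something like (parameter names are illustrative):

    -- One reading per sensor every 5 minutes; the index
    -- maintenance cost per row is negligible at this rate
    INSERT INTO Datapoint (DatastreamID, Timestamp, Value)
    VALUES (@StreamID, @Timestamp, @Value);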

Do you have access to SQL Server or even MySQL?

0

Your design should have a primary key on the table; an integer primary key is faster.

You need to analyze your SELECT queries to see what is happening behind the scenes: the SELECT should perform an index SEEK instead of a scan.
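On full SQL Server you can check this with SHOWPLAN, for example (SQL CE itself does not support these SET options, as far as I know; for SQL CE you can view the estimated plan in Management Studio instead):

    -- Return the execution plan instead of running the query;
    -- look for an Index Seek rather than a Table Scan
    SET SHOWPLAN_XML ON;
    GO
    SELECT Timestamp, Value
    FROM   Datapoint
    WHERE  DatastreamID = 1
      AND  Timestamp >= '2010-01-01' AND Timestamp < '2010-02-01';
    GO
    SET SHOWPLAN_XML OFF;
    GO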

If it is already slow at 100K rows, you should look at the query in a query analyzer.

Being a little slow would be understandable with 100M rows, but not with 100K rows.

Hope this helps

0

Can you use SQL Server Express Edition? You can create indexes on it just as in the full version. I have worked fine with databases containing over 100 million rows in SQL Server. SQL Server Express Edition limits the database size to 10 GB; if that fits your data, the free edition should work for you.

http://www.microsoft.com/express/Database/

0
