I am developing a web interface for a hardware sensor that produces roughly 250 kB/s of raw data (125 kS/s, 16 bits per sample). The web application visualizes this data in real time (using Canvas) and stores it (using IndexedDB). I am running into performance problems with the IndexedDB storage.
This application is designed to work for several days or even weeks and should reliably store large amounts of data (from tens to several hundred MB).
Since writing seems to be a common performance problem, I rewrote the application to store only one large chunk of data every 5 seconds, as a single object containing the whole array. This works, but I still get very choppy rendering performance, high CPU usage and high memory usage. The exact storage code:
// dataDB    = IndexedDB database opened in another function
// slice     = data to be stored (a multidimensional array)
// sessionID = key of the record in the object store
// This function is called about once every 5 seconds,
// with 700 000 values in the slice array.
function storeFastData(slice, sessionID) {
    var s = dataDB.transaction(["fastData"], "readwrite").objectStore("fastData");
    var fdreq = s.get(sessionID);
    fdreq.onsuccess = function (e) {
        // Read the existing record, merge the new samples into it,
        // and write the whole record back.
        var d = fdreq.result;
        for (var i = 0; i < slice.length; i++) {
            d.data[i][1] = slice[i][1];
        }
        s.put(d);
    };
}
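For comparison, here is a minimal sketch of an append-only variant that avoids the get-then-put round trip by writing each 5-second slice as its own record. The store name fastChunks, the compound key path, and the Int16Array packing are assumptions for illustration, not part of the code above.

// Hypothetical append-only variant: each 5-second slice becomes its own record,
// so nothing already stored has to be read back and rewritten.
// Assumes the object store was created elsewhere with
//   db.createObjectStore("fastChunks", { keyPath: ["sessionID", "chunkIndex"] });
var chunkIndex = 0;

function storeFastChunk(slice, sessionID) {
    var store = dataDB.transaction(["fastChunks"], "readwrite")
                      .objectStore("fastChunks");

    // Pack the samples into a typed array; structured cloning an Int16Array
    // is much cheaper than cloning 700 000 nested Array elements.
    var samples = new Int16Array(slice.length);
    for (var i = 0; i < slice.length; i++) {
        samples[i] = slice[i][1];
    }

    store.put({
        sessionID: sessionID,
        chunkIndex: chunkIndex++,
        samples: samples
    });
}

Reading a session back then means iterating its chunks (for example with an IDBKeyRange over the sessionID) instead of loading one ever-growing object.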
My questions:
- Is IndexedDB the right choice for this application?
- Am I doing something obviously wrong in how I implemented this? This is the first project I have built on IndexedDB.
- I read that using Web Workers could at least fix the stuttering, since the storage work would run on a different thread (a rough sketch follows this list). Will this solve my performance problems?
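Regarding the Web Worker point above: IndexedDB is available inside dedicated workers, so a rough sketch of moving the writes off the main thread could look like the following. File names, store names and the message shape are hypothetical.

// main.js (hypothetical): hand each 5-second slice to a worker instead of
// touching IndexedDB on the UI thread.
var storageWorker = new Worker("storage-worker.js");

function storeFastData(slice, sessionID) {
    // Copy the samples into a transferable buffer so postMessage
    // transfers ownership instead of cloning 700 000 array elements.
    var samples = new Int16Array(slice.length);
    for (var i = 0; i < slice.length; i++) {
        samples[i] = slice[i][1];
    }
    storageWorker.postMessage(
        { sessionID: sessionID, buffer: samples.buffer },
        [samples.buffer]
    );
}

// storage-worker.js (hypothetical): the worker owns the database connection,
// so cloning and serialising the data no longer blocks rendering.
var db;
var openReq = indexedDB.open("sensorData", 1);
openReq.onupgradeneeded = function (e) {
    e.target.result.createObjectStore("fastChunks", { autoIncrement: true });
};
openReq.onsuccess = function (e) {
    db = e.target.result;
};

onmessage = function (msg) {
    if (!db) {
        return; // still opening; a real implementation would queue the chunk
    }
    db.transaction(["fastChunks"], "readwrite")
      .objectStore("fastChunks")
      .put({
          sessionID: msg.data.sessionID,
          samples: new Int16Array(msg.data.buffer)
      });
};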
I am willing to use new (draft) functionality, but having to ask the user for permission to store more than 5 MB (for example, via the Quota Management API) every time the application opens is rather intrusive, and I would like to avoid it if at all possible.
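For reference, the Chrome-prefixed form of that draft Quota Management API was used roughly as sketched below; the requested size is only an example, and support and prompting behaviour differ between browsers.

// Hypothetical use of the (draft, Chrome-prefixed) Quota Management API:
// ask once for a larger persistent quota instead of being prompted later.
if (navigator.webkitPersistentStorage) {
    navigator.webkitPersistentStorage.requestQuota(
        500 * 1024 * 1024,  // example: request ~500 MB
        function (grantedBytes) {
            console.log("Granted", grantedBytes, "bytes of persistent storage");
        },
        function (err) {
            console.error("Quota request failed", err);
        }
    );
}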
I do not use jQuery. This cannot be written as a native application; it has to run inside the browser.