Best practice for local database cache?

I am working on an application that depends on a MySQL database for pieces of content. In some cases, the application will run in an environment with limited Internet connectivity (UMTS), in particular with high latency.

The application user can log in, and most of the content for the application's user interface is retrieved from the MySQL database. To avoid delays after a user logs in, I would like to cache the contents of the database on the client side as much as possible. New content should only be retrieved from the database when it has actually changed on the server. Is there a general approach to solving this problem? Any literature describing solid solutions?
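A common pattern for this is to store content locally together with a version number (or last-modified timestamp) and ask the server only "has anything changed since version X?", refetching the full content only when the answer is yes. A minimal sketch of that idea; all names here are illustrative, not from any particular library:

```python
class CachedContent:
    """Client-side cache: holds content plus the server version it came from."""

    def __init__(self, fetch_fn, version_fn):
        self._fetch_fn = fetch_fn      # expensive call: returns (version, content)
        self._version_fn = version_fn  # cheap call: returns the current server version
        self._version = None
        self._content = None

    def get(self):
        # Ask the server only for its current version; refetch the content
        # only when that version differs from the cached one.
        server_version = self._version_fn()
        if server_version != self._version:
            self._version, self._content = self._fetch_fn()
        return self._content
```

The cheap version check is the only round trip on the common path, which is what matters on a high-latency link; the same idea underlies HTTP's `ETag`/`If-None-Match` conditional requests.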


Small update: CouchDB now looks like a good fit for my specific use case. The main reason:

it allows users and servers to access and update the same shared data during disconnection, and then replicate these changes in two directions later.

(from: http://couchdb.apache.org/docs/overview.html )

So far this looks promising.
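For reference, the bidirectional sync quoted above is driven by CouchDB's `_replicate` HTTP endpoint: posting one replication job per direction gives the two-way behaviour. A sketch using only the standard library; the host and database names are placeholders for your own setup:

```python
import json
import urllib.request


def replication_request(source, target, continuous=True):
    """Build the JSON body that CouchDB's POST /_replicate endpoint expects."""
    return {"source": source, "target": target, "continuous": continuous}


def start_bidirectional_sync(local_db, remote_db, couch_url="http://localhost:5984"):
    """Kick off pull and push replications so changes flow in both directions."""
    for src, tgt in ((remote_db, local_db), (local_db, remote_db)):
        body = json.dumps(replication_request(src, tgt)).encode()
        req = urllib.request.Request(
            couch_url + "/_replicate",
            data=body,
            headers={"Content-Type": "application/json"},
        )
        urllib.request.urlopen(req)  # raises on HTTP errors
```

With `continuous=True` CouchDB keeps the replication running and resumes it after disconnections, which matches the intermittent-connectivity scenario in the question.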

2 answers

We actually run a local copy of the database, and most of the software just talks to the local database, which is always available. A background process then synchronizes the two databases.
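The background synchronizer described above can be sketched as a periodic last-write-wins merge. In this toy version the two "databases" are dicts mapping a key to a `(timestamp, value)` pair, standing in for MySQL tables with a `last_modified` column; everything here is an illustrative assumption, not the answerer's actual code:

```python
import threading
import time


def sync_once(local_db, remote_db):
    """Copy rows that changed on either side; the newest timestamp wins."""
    for key in set(local_db) | set(remote_db):
        local = local_db.get(key)
        remote = remote_db.get(key)
        if local is None or (remote is not None and remote[0] > local[0]):
            local_db[key] = remote      # pull newer remote row
        elif remote is None or local[0] > remote[0]:
            remote_db[key] = local      # push newer local row


def start_background_sync(local_db, remote_db, interval=60.0):
    """Run sync_once periodically on a daemon thread, as the answer describes."""
    def loop():
        while True:
            sync_once(local_db, remote_db)
            time.sleep(interval)

    t = threading.Thread(target=loop, daemon=True)
    t.start()
    return t
```

A real implementation also has to handle conflicting edits to the same row during a disconnection, which is exactly the problem CouchDB's replication (mentioned in the question's update) solves out of the box.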



Another option is to cache the content in JavaScript on the client and refresh it with background Ajax requests, so the interface stays responsive even when the connection is slow.

