As pointed out in monkut's comment on the question, you can get some MySQL acceleration by storing the test database on a RAM disk, for example on a tmpfs file system if you are running Linux. However, if you are concerned about speed during tests, this may not provide much improvement, since MySQL usually caches its data in memory anyway.
The approach I used is to write a test decorator that skips tests not supported by the current database backend, and then use different settings.py files to run the tests against different database servers.
Here I used django-nose and wrote a decorator that skips a test if any of the test databases use SQLite:
import functools

from nose import SkipTest
from django.conf import settings

def skip_if_sqlite(test_fn):
    """Nose test decorator to skip a test when using sqlite databases.

    This may be useful when we know that a test will fail when using
    sqlite, perhaps because the test uses functionality not supported
    on sqlite such as database views or full text indexes.

    This decorator can be used as follows:

        @skip_if_sqlite
        def my_test_that_shouldnt_run_if_sqlite_database(self):
            pass
    """
    @functools.wraps(test_fn)
    def wrapper(*args, **kwargs):
        # Skip the test if any configured database uses the sqlite backend.
        if any(db['ENGINE'].endswith('sqlite3')
               for db in settings.DATABASES.values()):
            raise SkipTest('Test not supported on sqlite')
        return test_fn(*args, **kwargs)
    return wrapper
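For example (a minimal sketch; the test case and test names here are made up for illustration), the decorator is applied to an individual test method in an ordinary Django test case:

from django.test import TestCase

class ReportTests(TestCase):

    @skip_if_sqlite
    def test_full_text_search(self):
        # This test exercises a MySQL full text index, so it is skipped
        # automatically when the test databases use sqlite.
        self.assertTrue(True)  # placeholder assertion for the sketch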
Then configure settings.test.mysql and settings.test.sqlite with different DATABASES settings. If you run the tests using settings.test.sqlite, tests that rely on MySQL-specific functionality will be skipped; if you run them using settings.test.mysql, those tests will run.
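As a rough sketch (the module paths, database name, and credentials here are assumptions for illustration), the two settings modules might differ only in their DATABASES setting:

# settings/test/sqlite.py -- run with: manage.py test --settings=settings.test.sqlite
from settings.base import *  # assumed shared base settings module

DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.sqlite3',
        'NAME': ':memory:',  # in-memory database keeps the test run fast
    }
}

# settings/test/mysql.py -- run with: manage.py test --settings=settings.test.mysql
#
# from settings.base import *
#
# DATABASES = {
#     'default': {
#         'ENGINE': 'django.db.backends.mysql',
#         'NAME': 'myproject_test',
#         'USER': 'test',
#         'PASSWORD': 'test',
#         'HOST': '127.0.0.1',
#     }
# }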
This approach allows you to target individual tests at a specific database backend, while keeping the flexibility to run most of your tests against a faster SQLite database during development.