Running Django MySQL tests in memory

I have a Django 1.4 project that uses MySQL as its backend. For speed, my test setup switches to an in-memory SQLite database:

    if 'test' in sys.argv:
        DATABASES['default'] = {'ENGINE': 'django.db.backends.sqlite3'}

The problem is that I need MySQL-specific functionality (full-text indexes).

Is there a way to run MySQL in memory for Django testing?

My project relies on full-text indexes. During development I run syncdb and then execute a .sql file that creates the full-text indexes.

I would like to use the Django ORM's full-text search in the functions under test, so I am trying to add the full-text index manually during each test's initialization, for example:

    cursor.execute('alter table mytable add fulltext(one, two)')
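Concretely, the test setup I am attempting looks something like this (the app, model, and column names are just placeholders):

    from django.db import connection, DatabaseError
    from django.test import TransactionTestCase

    from myapp.models import MyModel  # placeholder app and model


    class FullTextSearchTest(TransactionTestCase):

        def setUp(self):
            # ALTER TABLE commits implicitly on MySQL and is not rolled
            # back between tests, so ignore the "duplicate key name"
            # error on every run after the first.
            cursor = connection.cursor()
            try:
                cursor.execute('alter table myapp_mymodel add fulltext(one, two)')
            except DatabaseError:
                pass

        def test_fulltext_search(self):
            # The MySQL-only __search lookup issues a MATCH ... AGAINST query.
            results = MyModel.objects.filter(one__search='django')
            self.assertEqual(list(results), [])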

That ALTER TABLE fails under SQLite (I believe because SQLite does not support MySQL-style FULLTEXT indexes).

The SQL above DOES work when I remove the in-memory test setting and run the tests against MySQL, but I like the speed of in-memory testing. Is there a way to run MySQL in memory?

How do people test applications that rely on backend-specific database features, e.g. full-text indexing, GIS, etc.? Do they have to run their tests against a normal on-disk database?

thanks

+4
3 answers

Do not test MySQL; MySQL has already been tested by its developers. Your job is to structure your code so that you can replace the MySQL connection with a Python mock object that returns whatever MySQL would return (or asserts that the query sent to MySQL is the one you expect).
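A minimal sketch of that idea using the mock library; the module path myapp.search and the function search_articles are invented for illustration:

    import mock  # the standalone mock package (Python 2 era)

    from django.test import SimpleTestCase


    class SearchQueryTest(SimpleTestCase):

        def test_sends_fulltext_query(self):
            # Replace the database connection used by the code under
            # test, return a canned row, and assert on the SQL it
            # sent, instead of talking to a real MySQL server.
            with mock.patch('myapp.search.connection') as connection:
                cursor = connection.cursor.return_value
                cursor.fetchall.return_value = [(1, 'a matching row')]

                from myapp.search import search_articles
                search_articles('django')

                sql = cursor.execute.call_args[0][0]
                self.assertIn('MATCH', sql.upper())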

-4

MySQL has a MEMORY storage engine. You can activate it with the OPTIONS key:

    if 'test' in sys.argv:
        DATABASES = {
            'default': {
                'ENGINE': 'django.db.backends.mysql',
                'HOST': 'localhost',
                'NAME': 'foo',
                'USER': 'bar',
                'PASSWORD': 'baz',
                'OPTIONS': {
                    'init_command': 'SET storage_engine=MEMORY',
                },
            },
        }

BUT, according to the documentation:

 MEMORY tables cannot contain BLOB or TEXT columns. 

Therefore, I believe this is pretty useless for your (and many other) use cases; in any case, the MEMORY engine does not support FULLTEXT indexes either, so it would not help here.

I found some other tips for speeding up MySQL tests in this thread. Be sure to read Daniel Roseman's answer there about using the InnoDB engine.

+12

As monkut pointed out in a comment on the question, you can get some MySQL speed-up by keeping the test database on a RAM disk, e.g. a tmpfs file system if you are running on Linux. However, this may not buy you much, since MySQL usually caches its data in memory anyway.

The approach I have used is to write a test decorator that skips a test when the feature it needs is not supported by the current database, and then to use different settings.py files to run the tests against different backends.

Here I used django-nose and wrote a decorator that skips the test if any of the test databases is using SQLite:

    import functools

    from django.conf import settings
    from nose import SkipTest


    def skip_if_sqlite(test_fn):
        """Nose test decorator to skip a test when testing against
        SQLite databases.

        This may be useful when we know that a test will fail when
        using SQLite, possibly because the test uses functionality
        not supported there, such as database views or full-text
        indexes.

        This decorator can be used as follows:

            @skip_if_sqlite
            def my_test_that_shouldnt_run_if_sqlite_database(self):
                pass
        """
        @functools.wraps(test_fn)
        def wrapper(*args, **kwargs):
            # Skip the test if any configured database uses a sqlite engine.
            for alias, db_settings in settings.DATABASES.iteritems():
                if 'sqlite' in db_settings['ENGINE']:
                    raise SkipTest()
            return test_fn(*args, **kwargs)
        return wrapper

Then configure settings.test.mysql and settings.test.sqlite with different DATABASES settings. If you run the tests with settings.test.sqlite, the tests that rely on MySQL-specific features will be skipped; if you run them with settings.test.mysql, they will all run.
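For example, the two settings modules might look roughly like this (module paths and credentials are illustrative):

    # settings/test/sqlite.py -- fast in-memory runs during development
    from myproject.settings import *

    DATABASES = {
        'default': {'ENGINE': 'django.db.backends.sqlite3'},
    }

    # settings/test/mysql.py -- full runs, including MySQL-only tests
    from myproject.settings import *

    DATABASES = {
        'default': {
            'ENGINE': 'django.db.backends.mysql',
            'HOST': 'localhost',
            'NAME': 'myproject',
            'USER': 'bar',
            'PASSWORD': 'baz',
        },
    }

You can then select one at test time with python manage.py test --settings=settings.test.sqlite (or settings.test.mysql).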

This approach lets you target tests at a specific database backend, while keeping the flexibility to run most of your tests against a faster in-memory SQLite database during development.

+3
