Get Celery To Use The Django Test Database

I am trying to write some unit tests for some Celery tasks in my Django application. These tasks take a model's identifier as an argument, do some work, and update the model. When I run the dev server and a Celery worker everything works fine, but when I ran my tests it became clear that the Celery task does not use the Django test database, which is created and destroyed as part of the test run. The question is: how can I get Celery to use the same temporary database as the rest of my tests?

As you can see, I am using the settings overrides that are suggested in every answer to similar problems.

UPDATE: I have found that if I pass the object itself to the task, instead of passing its identifier and re-fetching it from the database, the tests pass without affecting how the task functions. So, at least for now, this will be my solution (see the sketch after the task code below).

In my test:

    class JobTest(TestCase):

        @override_settings(CELERY_ALWAYS_EAGER=True,
                           CELERY_EAGER_PROPAGATES_EXCEPTIONS=True,
                           BROKER_BACKEND='memory')
        def test_Job_Complete(self):
            job = models.Job()
            job.save()
            tasks.do_a_thing(job.id)
            self.assertTrue(job.complete)

In my task:

    @celery.task
    def do_a_thing(job_id):
        job = models.Job.objects.get(pk=job_id)
        bunch_of_things(job)
        job.complete = True
        job.save()
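
For reference, a minimal sketch of the workaround described in the update above, reusing the same hypothetical names (models.Job, bunch_of_things): the task receives the model instance itself, so it works on the object the test already holds instead of re-fetching a row from the database.

    @celery.task
    def do_a_thing_with_object(job):
        # Hypothetical variant of do_a_thing that takes the Job instance
        # directly instead of its primary key.
        bunch_of_things(job)
        job.complete = True
        job.save()
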
3 answers

There are no obvious problems with your code. You do not need to run a Celery worker at all: with those settings, Celery runs the task synchronously in-process and never actually sends anything to the message queue.
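
As a quick illustration (a sketch only, reusing the hypothetical models.Job and tasks.do_a_thing from the question), calling .delay() under CELERY_ALWAYS_EAGER returns an already-finished EagerResult, and the row the task updated can be read straight back from the test database:

    from django.test import TestCase, override_settings
    from myapp import models, tasks  # assumed app layout, as in the question

    @override_settings(CELERY_ALWAYS_EAGER=True,
                       CELERY_EAGER_PROPAGATES_EXCEPTIONS=True,
                       BROKER_BACKEND='memory')
    class EagerJobTest(TestCase):
        def test_job_complete(self):
            job = models.Job.objects.create()
            result = tasks.do_a_thing.delay(job.id)  # runs inline, no broker involved
            self.assertTrue(result.successful())     # EagerResult, already finished
            self.assertTrue(models.Job.objects.get(pk=job.id).complete)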

You can't easily run tests against live Celery workers anyway, because each test is wrapped in a transaction: even if the tests and the worker connected to the same database (which they don't), the transaction is always rolled back at the end of the test and its changes are never visible to the worker.

If you really need to do this, see fooobar.com/questions/964873/...


I fought with a similar problem. The following solution is not clean, but it works.

  • Create a separate Django settings file that inherits from your main one. Let's call it integration_testing.py.
  • Your file should look like this:
    from .settings import *

    DATABASES = {
        'default': {
            'ENGINE': '<your engine>',
            'NAME': 'test_<your database name>',
            'USER': '<your db user>',
            'PASSWORD': '<your db password>',
            'HOST': '<your hostname>',
            'PORT': '<your port number>',
        }
    }

  • Create a shell script that sets up your environment and starts the Celery worker:

    #!/usr/bin/env bash

    export DJANGO_SETTINGS_MODULE="YOURPROJECTNAME.settings.integration_testing"

    celery purge -A YOURPROJECTNAME -f && celery worker -A YOURPROJECTNAME -l debug

  • The above works if you set up celery in this way:

    app = Celery('YOURPROJECTNAME')

    app.config_from_object('django.conf:settings', namespace='CELERY')

  • Run the script in the background.

  • Make every test that uses Celery inherit from TransactionTestCase (or APITransactionTestCase in django-rest-framework), so that the data the test creates is actually committed and visible to the worker.

  • Run your unit tests that use Celery. Any Celery tasks will now use your test db (see the sketch below). And hope for the best.
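
For illustration, a minimal sketch of such a test, assuming the Job model and do_a_thing task from the question and a result backend configured so that .get() can block until the live worker finishes:

    from django.test import TransactionTestCase
    from myapp import models, tasks  # assumed app layout

    class JobIntegrationTest(TransactionTestCase):
        def test_job_complete(self):
            job = models.Job.objects.create()
            # TransactionTestCase commits, so the worker started by the
            # shell script above can see this row in the test database.
            tasks.do_a_thing.delay(job.id).get(timeout=10)
            job.refresh_from_db()
            self.assertTrue(job.complete)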


One way to ensure that a Celery worker uses the same test database as the tests is to spawn the worker inside the test itself. This can be done with celery.contrib.testing.worker.start_worker in the TestCase's setUpClass method.

You should also use Django's SimpleTestCase (or REST framework's APISimpleTestCase) rather than a plain TestCase, so that the Celery thread and the test thread can see the changes each other makes to the test database. Changes are still destroyed at the end of the test run, but they are not destroyed between tests unless you clean them up manually in the tearDown method.
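
A minimal sketch of that setup, assuming the Celery app lives at myproject.celery and reusing the hypothetical Job model and do_a_thing task from the question (on older Django versions, replace databases = '__all__' with allow_database_queries = True):

    from celery.contrib.testing.worker import start_worker
    from django.test import SimpleTestCase

    from myproject.celery import app  # assumed location of the Celery app
    from myapp import models, tasks   # assumed app layout

    class JobLiveWorkerTest(SimpleTestCase):
        databases = '__all__'  # let this SimpleTestCase touch the test database

        @classmethod
        def setUpClass(cls):
            super().setUpClass()
            # Start an in-process worker thread that shares the test database.
            cls.celery_worker = start_worker(app, perform_ping_check=False)
            cls.celery_worker.__enter__()

        @classmethod
        def tearDownClass(cls):
            cls.celery_worker.__exit__(None, None, None)
            super().tearDownClass()

        def tearDown(self):
            models.Job.objects.all().delete()  # no transaction rollback here

        def test_job_complete(self):
            job = models.Job.objects.create()
            tasks.do_a_thing.delay(job.id).get(timeout=10)
            job.refresh_from_db()
            self.assertTrue(job.complete)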

