Make Django Test Database Visible to Celery

When a Django test case runs, it creates an isolated test database, so that database records are rolled back after each test completes. I am trying to create an integration test with Celery, but I cannot figure out how to connect Celery to this ephemeral test database. In a naive setup, objects saved in Django are invisible to Celery, and objects saved by Celery persist indefinitely.

Here is a test case example:

    import json

    from rest_framework.test import APITestCase

    from myapp.models import MyModel
    from myapp.util import get_result_from_response


    class MyTestCase(APITestCase):
        @classmethod
        def setUpTestData(cls):
            # This object is not visible to Celery
            MyModel(id='test_object').save()

        def test_celery_integration(self):
            # This view spawns a Celery task
            # Task should see MyModel.objects.get(id='test_object'), but can't
            http_response = self.client.post('/', 'test_data', format='json')
            result = get_result_from_response(http_response)
            result.get()  # Wait for task to finish before ending test case
            # Objects saved by the Celery task should be deleted, but persist

I have two questions:

  • How can I make Celery see the objects that were created in the Django test case?

  • How can I guarantee that all objects saved by Celery are automatically rolled back when the test completes?

I am willing to clean up the objects manually if automatic rollback is not possible, but even deleting objects in tearDown of an APISimpleTestCase seems to be rolled back.

+2
django testing celery
2 answers

This is possible by starting a Celery worker within the Django test case.

Background

Django's in-memory test database is SQLite. As the SQLite documentation on in-memory databases states, "all database connections sharing the in-memory database need to be in the same process." This means that as long as Django uses an in-memory test database and Celery runs in a separate process, it is fundamentally impossible for Celery and Django to share the test database.
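To illustrate the constraint outside Django, here is a minimal sketch using plain sqlite3 (the database name testdb is arbitrary):

    import sqlite3

    # A shared-cache in-memory database is visible to other connections
    # in the SAME process...
    a = sqlite3.connect('file:testdb?mode=memory&cache=shared', uri=True)
    b = sqlite3.connect('file:testdb?mode=memory&cache=shared', uri=True)
    a.execute('CREATE TABLE t (x INTEGER)')
    print(b.execute("SELECT name FROM sqlite_master").fetchall())  # [('t',)]
    # ...but a second process opening the same URI gets its own, empty database.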

However, with celery.contrib.testing.worker.start_worker, it is possible to start a Celery worker in a separate thread within the same process. This worker can access the in-memory database.

This assumes that Celery is already set up with the Django project in the usual way.
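For illustration, a minimal sketch of that API (the mysite.celery import path matches the example below; my_task is a hypothetical task):

    from celery.contrib.testing.worker import start_worker

    from mysite.celery import app    # assumption: the project's Celery app
    from myapp.tasks import my_task  # hypothetical task for illustration

    # start_worker(app) returns a context manager; while it is entered, a
    # worker thread in this process consumes and executes queued tasks.
    with start_worker(app):
        result = my_task.delay()
        # .get() additionally requires a result backend to be configured
        print(result.get(timeout=10))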

Solution

Since the Django-Celery interaction involves some cross-thread communication, only test cases that do not run each test in an isolated transaction will work. The test case must inherit directly from SimpleTestCase, or its REST framework equivalent APISimpleTestCase, and set the class attribute allow_database_queries to True.

The key is to start the Celery worker in the setUpClass method of the TestCase and shut it down in the tearDownClass method. The key function is celery.contrib.testing.worker.start_worker(app), which requires an instance of the current Celery application, presumably obtained from mysite.celery.app, and returns a Python ContextManager with __enter__ and __exit__ methods that must be called in setUpClass and tearDownClass, respectively. There is probably a way to avoid manually entering and exiting the ContextManager with a decorator or something else, but I could not figure it out. The following is an example tests.py file:

    from celery.contrib.testing.worker import start_worker
    from django.test import SimpleTestCase

    from mysite.celery import app


    class BatchSimulationTestCase(SimpleTestCase):
        allow_database_queries = True

        @classmethod
        def setUpClass(cls):
            super().setUpClass()
            # Start up celery worker
            cls.celery_worker = start_worker(app)
            cls.celery_worker.__enter__()

        @classmethod
        def tearDownClass(cls):
            super().tearDownClass()
            # Close worker
            cls.celery_worker.__exit__(None, None, None)

        def test_my_function(self):
            # my_task.delay() or something
            pass
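Note that in newer Django versions (2.2 and later), allow_database_queries has been replaced by the databases attribute; a minimal sketch of the equivalent declaration:

    from django.test import SimpleTestCase

    class BatchSimulationTestCase(SimpleTestCase):
        # Django >= 2.2: declare which databases the tests may query
        databases = '__all__'  # or {'default'}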

For some reason, the test worker tries to use a task named 'celery.ping', probably to provide better error messages in the case of worker failure. Even passing perform_ping_check=False as a keyword argument to start_worker still checks for its existence. The task it is looking for is celery.contrib.testing.tasks.ping, but this task is not installed by default. It should be possible to provide it by adding celery.contrib.testing to INSTALLED_APPS in settings.py. However, this only makes it visible to the worker, not to the code that spawns the worker. The code that spawns the worker performs assert 'celery.ping' in app.tasks, which fails. Commenting this assertion out makes everything work, but modifying an installed library is not a good solution. I'm probably doing something wrong, but the workaround I settled on is to copy that simple function to somewhere it will be picked up by app.autodiscover_tasks(), for example celery.py:

    @app.task(name='celery.ping')
    def ping():
        # type: () -> str
        """Simple task that just returns 'pong'."""
        return 'pong'

Now, when the tests are run, there is no need to start a separate Celery process. A Celery worker is launched inside the Django test process as a separate thread. This worker can see any in-memory databases, including the default in-memory test database. To control the number of workers, there are options available in start_worker, but the default is a single worker.
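For example, a sketch of passing those options (the values shown are illustrative):

    # In setUpClass, instead of the bare start_worker(app):
    cls.celery_worker = start_worker(
        app,
        concurrency=1,    # number of worker threads; 1 is the default
        pool='solo',      # in-process pool; 'solo' is the default
        loglevel='info',  # surface worker logs while the tests run
    )
    cls.celery_worker.__enter__()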

+4

For your unit tests, I would recommend skipping the Celery dependency; the following two links will provide you with the information you need to get your unit tests started:
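One common way to cut the Celery dependency out of unit tests is to run tasks eagerly; a minimal sketch, assuming the Celery app reads its configuration from Django settings with the CELERY namespace:

    # settings.py used by the test run (assumes the standard
    # app.config_from_object('django.conf:settings', namespace='CELERY') setup)
    CELERY_TASK_ALWAYS_EAGER = True      # .delay() executes the task inline
    CELERY_TASK_EAGER_PROPAGATES = True  # task exceptions surface in the test

Alternatively, the task's delay method can simply be mocked out with unittest.mock.patch when the test only needs to assert that the task was enqueued.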

If you really want to test the Celery calls end to end, including a queue, I would probably set up a docker-compose with a server, worker, and queue combination and extend the custom CeleryTestRunner from the django-celery docs. But I don't see much benefit in that, because such a test system is too far from production to be representative.

+3
