Django Asynchronous Processing

I have a number of Django requests that do some math (written in C and called through a Cython module) which can take an indeterminate amount of time (on the order of 1 second) to execute. The computations do not need database access and are independent of each other and of Django.

Right now everything is synchronous (using Gunicorn with sync worker types), but I would like to make it asynchronous and non-blocking. Roughly, I would like to do the following:

  • Receive the AJAX request
  • Hand the task to an available worker (without blocking the main Django web application)
  • The worker executes the task for some unknown amount of time
  • Django returns the result of the computation (a list of strings) as JSON whenever the task completes
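For context, the current synchronous view looks roughly like the sketch below (heavy_computation is a hypothetical stand-in for the Cython-backed function; it blocks the Gunicorn sync worker for about a second):

    # views.py -- current synchronous version (sketch)
    import json

    from django.http import HttpResponse

    from mathlib import heavy_computation  # hypothetical Cython-backed function


    def compute(request):
        # Blocks the sync worker for ~1 second while the C code runs
        result = heavy_computation(request.GET.get("input", ""))
        return HttpResponse(json.dumps({"result": result}),
                            content_type="application/json")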

I am quite new to asynchronous Django, so my question is: what is the best stack for this?

Is this the kind of workload a task queue is well suited for? Would anyone recommend Tornado + Celery + RabbitMQ, or perhaps something else?

Thanks in advance!

2 answers

Celery would be perfect for this.

Since what you are doing is relatively simple (read: you don't need complicated rules about how tasks are routed), you can probably get away with using the Redis backend, which means you don't need to set up and configure RabbitMQ (which, in my experience, is more difficult).

I am using Redis with one of my larger Celery setups, and here are the relevant bits of my configuration:

    # Use redis as a queue
    BROKER_BACKEND = "kombu.transport.pyredis.Transport"
    BROKER_HOST = "localhost"
    BROKER_PORT = 6379
    BROKER_VHOST = "0"

    # Store results in redis
    CELERY_RESULT_BACKEND = "redis"
    REDIS_HOST = "localhost"
    REDIS_PORT = 6379
    REDIS_DB = "0"

I also use django-celery, which makes the integration with Django seamless.
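As a rough sketch of how the pieces could fit together (not a drop-in solution; heavy_computation, the module paths, and the polling approach are assumptions of mine, and the decorator import is written against a recent Celery API rather than django-celery's older one): wrap the Cython call in a Celery task, have the AJAX view enqueue it and return a task id, and let the client poll a second view for the JSON result.

    # tasks.py -- minimal sketch; heavy_computation is the hypothetical
    # Cython-backed function from the question
    from celery import shared_task

    from mathlib import heavy_computation


    @shared_task
    def compute_task(payload):
        # Runs in a Celery worker process, so web workers never block on it
        return heavy_computation(payload)  # a list of strings


    # views.py
    import json

    from django.http import HttpResponse

    from tasks import compute_task


    def start_compute(request):
        # Enqueue the work and hand the task id back to the AJAX caller
        async_result = compute_task.delay(request.GET.get("input", ""))
        return HttpResponse(json.dumps({"task_id": async_result.id}),
                            content_type="application/json")


    def compute_status(request, task_id):
        # The client polls this endpoint until the result is ready
        async_result = compute_task.AsyncResult(task_id)
        if async_result.ready():
            data = {"status": "done", "result": async_result.get()}
        else:
            data = {"status": "pending"}
        return HttpResponse(json.dumps(data), content_type="application/json")

With the Redis result backend configured as above, results are stored in Redis, so the polling view can read them from any web worker.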

Comment if you need more specific advice.


Since you plan to make this asynchronous (presumably using something like gevent), you could also consider running a multi-threaded / forking backend service for the computational work.

The asynchronous frontend server can handle all the lightweight work and fetch data from databases suitable for async use (Redis, or MySQL with a special driver), and so on. When a computation needs to be performed, the frontend server sends all the input data to the backend server and retrieves the result when the backend server finishes the computation.

Since the frontend is asynchronous, it will not block while waiting for the results. The advantage of this over using Celery is that you can return the result to the client as soon as it becomes available.

    client browser <-> async frontend server <-> backend server for computations
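A minimal sketch of that split, assuming the backend is exposed as a small WSGI service over plain HTTP on a separate port and the frontend runs under a gevent worker (all names, ports, and the use of the requests library are my own assumptions):

    # backend.py -- the computation service; run it with several forked
    # workers on its own port, e.g. "gunicorn -w 4 -b :8001 backend:app".
    # heavy_computation is the hypothetical Cython-backed function.
    import json

    from mathlib import heavy_computation


    def app(environ, start_response):
        length = int(environ.get("CONTENT_LENGTH") or 0)
        payload = json.loads(environ["wsgi.input"].read(length) or "{}")
        body = json.dumps({"result": heavy_computation(payload.get("input", ""))})
        start_response("200 OK", [("Content-Type", "application/json")])
        return [body.encode("utf-8")]


    # views.py -- frontend view; run Django under an async worker, e.g.
    # "gunicorn -k gevent myproject.wsgi". With gevent's monkey patching,
    # this HTTP call yields to other requests while the backend computes.
    import requests

    from django.http import HttpResponse

    BACKEND_URL = "http://localhost:8001/"  # hypothetical backend address


    def compute(request):
        reply = requests.post(BACKEND_URL,
                              json={"input": request.GET.get("input", "")})
        return HttpResponse(reply.content, content_type="application/json")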
