Django Celery in production

I have everything I want working with django-celery on my local development machine. I run Django, djcelery, celery, and the broker (Amazon SQS). It sends tasks and just works.

I can set all of this up in production the same way I did locally (i.e. everything on one machine), but what happens when I want to distribute tasks to other machines or add more workers? Do I just clone the current machine (with Django, djcelery, and celery) and point each copy at the same SQS queue? How does it work? If they all connect to the same broker, do they just "know" about each other, or does it not work like that?

Can I start with just one machine, as I did in development (I'll daemonize celery in production)?

1 answer

Amazon SQS is a simple queueing service: jobs go onto the queue and are removed once they complete. Celery just consumes this queue.

Celery can scale both horizontally and vertically. Do you need celery to process more jobs, faster? Give your machine more resources and increase the number of worker processes (vertical scaling), or add more, smaller machines (horizontal scaling). Either way, all of your celery workers consume the same queue on SQS. How the rest of your infrastructure is affected depends on what your celery tasks do: if they write to a database, then the more workers you have, the higher the load on that database, so you'll need to look at scaling it as well.

So yes, it's fine to start with the "everything on one machine" approach. As demand on your application grows, you can move the celery workers onto more machines, or provision more resources on the single server.
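For the single-machine start, the worker can be daemonized alongside Django with `celery multi`; the app name `proj`, the concurrency value, and the file paths below are placeholder assumptions. Running the same command on additional machines later adds capacity with no other changes, since they all read from the same SQS queue.

```shell
# Start a daemonized worker (proj is a placeholder for your app module).
celery multi start worker1 -A proj --concurrency=4 \
    --pidfile=/var/run/celery/%n.pid \
    --logfile=/var/log/celery/%n%I.log

# Stop it again:
celery multi stop worker1 --pidfile=/var/run/celery/%n.pid
```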

Does it help? :)
