I don't think you should use Celery here; cron still sounds fine for your case, but you can give Celery a try.
In short, Celery is a Python framework for asynchronous, distributed task queues. It lets you hand long-running work off to worker processes running on one or more machines (a single worker on a single machine is perfectly fine too). When you need to do something that takes time (for example, generating thumbnails, talking to an external API, or building complex reports), Celery can run it in the background without blocking the HTTP request for your user.
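For illustration, here is a minimal sketch of what a Celery task looks like; the module name, broker URL and task body are assumptions on my part, not something from your setup:

```python
# tasks.py - a minimal sketch, assuming a Redis broker on localhost
from celery import Celery

app = Celery("myproject", broker="redis://localhost:6379/0")

@app.task
def generate_thumbnail(image_path):
    # The long-running work happens here, outside the request/response cycle.
    ...
```

Calling `generate_thumbnail.delay("/uploads/cat.jpg")` from a view returns immediately; a worker started with `celery -A tasks worker` picks the job up and runs it in the background.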
Some advantages of Celery over crontab:
- tasks run asynchronously, as soon as at least one Celery worker is free
- it scales well across multiple processes and machines
- celery beat is similar to crontab, but you can schedule tasks for a given date or at regular intervals using Python syntax in your settings .py (see the sketch after this list)
- you can apply rate limits to tasks (e.g. for a rough form of prioritization), also shown in the sketch below
- there are monitoring tools such as Flower, which give you a decent overview of which tasks failed and which succeeded
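As a sketch of the scheduling and rate-limit points above, this is roughly what the Python-syntax configuration looks like; the task names and intervals are illustrative assumptions:

```python
# celeryconfig.py - a sketch of beat scheduling and rate limiting
from celery.schedules import crontab

beat_schedule = {
    "send-daily-report": {
        "task": "tasks.send_report",              # hypothetical task
        "schedule": crontab(hour=7, minute=30),   # every day at 07:30
    },
    "poll-external-api": {
        "task": "tasks.poll_api",                 # hypothetical task
        "schedule": 300.0,                        # every 5 minutes
    },
}

# Cap how often a task may run, e.g. to avoid hammering an external API.
task_annotations = {"tasks.poll_api": {"rate_limit": "10/m"}}
```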
Some disadvantages of Celery:
- setup takes some time: you need to set up a message broker and daemonize the workers in production, whereas cron is already there (a minimal production setup is sketched after this list)
- each worker is likely to use roughly as much RAM as your Django process; that may cost you money, or you may simply not have enough RAM to run Celery on, say, an AWS free tier instance
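To give a sense of the setup cost, this is roughly the wiring a Django project needs; "myproject" and the Redis URL are assumptions, not something from your project:

```python
# settings.py - point Celery at a broker
CELERY_BROKER_URL = "redis://localhost:6379/0"

# celery.py (next to settings.py) - the app that workers load
import os
from celery import Celery

os.environ.setdefault("DJANGO_SETTINGS_MODULE", "myproject.settings")
app = Celery("myproject")
app.config_from_object("django.conf:settings", namespace="CELERY")
app.autodiscover_tasks()
```

In production the worker process (`celery -A myproject worker`) also has to be kept alive by something like systemd or supervisord; with cron there is nothing extra to keep running.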
Also, if all you need is to send emails, you might consider a paid service such as Postmark (I'm not affiliated with them), which will handle email delivery for you.
Peter Kilczuk