I have a Django application that uses Celery, and I can run the worker successfully from the command line, as shown below.
celery -A tasks worker --loglevel=info
But since the worker needs to run as a daemon, I wrote the celery.conf file below inside /etc/supervisor/conf.d/:
; ==================================
; celery worker supervisor example
; ==================================

[program:celery]
; Set full path to celery program if using virtualenv
command=/root/Envs/proj/bin/celery -A app.tasks worker --loglevel=info
user=root
environment=C_FORCE_ROOT="yes"
environment=HOME="/root",USER="root"
directory=/root/apps/proj/structure
numprocs=1
stdout_logfile=/var/log/celery/worker.log
stderr_logfile=/var/log/celery/worker.log
autostart=true
autorestart=true
startsecs=10

; Need to wait for currently executing tasks to finish at shutdown.
; Increase this if you have very long running tasks.
stopwaitsecs = 600

; When resorting to send SIGKILL to the program to terminate it
; send SIGKILL to its whole process group instead,
; taking care of its children as well.
killasgroup=true

; if rabbitmq is supervised, set its priority higher
; so it starts first
priority=998
But when I reloaded supervisor with supervisorctl reread and supervisorctl update, supervisorctl status reported:
celery FATAL Exited too quickly (process log may have details)
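To be explicit, the sequence I ran was the following (the last line is copied from supervisorctl's output):

supervisorctl reread
supervisorctl update
supervisorctl status
celery    FATAL    Exited too quickly (process log may have details)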
So I went to the worker.log file and saw the error message below:
Running a worker with superuser privileges when the worker accepts messages serialized with pickle is a very bad idea! If you really want to continue then you have to set the C_FORCE_ROOT environment variable (but please think about this before you do). User information: uid=0 euid=0 gid=0 egid=0
So why does it complain about C_FORCE_ROOT even though we set it as an environment variable inside the supervisor conf file? What am I doing wrong in the above conf file?
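One thing I am not sure about is whether supervisor merges two environment= lines in the same [program:x] section, or whether the second one simply replaces the first. If they do not merge, then I am assuming (not verified) that all the variables would have to go on a single line, something like:

[program:celery]
; assumption: a later environment= key replaces the earlier one,
; so all variables must be declared together on one line
environment=C_FORCE_ROOT="yes",HOME="/root",USER="root"

Is that the problem here, or is something else wrong?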