I can't get my Celery worker to keep listening on the default queue; the Celery container keeps exiting.
```
$ docker-compose up
Starting tasker_rabbitmq_1
Starting tasker_celery_1
Attaching to tasker_rabbitmq_1, tasker_celery_1
tasker_celery_1 exited with code 1
rabbitmq_1 |
rabbitmq_1 | RabbitMQ 3.6.1. Copyright (C) 2007-2016 Pivotal Software, Inc.
rabbitmq_1 | ## ## Licensed under the MPL. See http://www.rabbitmq.com/
rabbitmq_1 |
```
I am trying to build an application with a separate task layer: a container that runs independently defined tasks. The architecture:
- Web/app layer: Django on EBS
- Worker layer: Celery + RabbitMQ as Docker containers
This is what I have:
Folder structure:
```
tasker/
    tasker/
        tasks.py
        celeryconfig.py
        __init__.py
    Dockerfile
    docker-compose.yml
    requirements.txt
```
tasks.py:
```python
from celery import Celery
from celery import task

celery = Celery('tasks', broker='amqp://guest@localhost//')

import os


@celery.task
def add(x, y):
    return x + y
```
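For what it's worth, the task body itself is plain Python with no external dependencies, so I don't think the worker exit can be caused by the task logic (a minimal sketch with the Celery decorator stripped off):

```python
# Minimal sketch of the task logic without any Celery machinery:
# the function the worker would execute is plain addition.
def add(x, y):
    return x + y

print(add(2, 3))  # prints: 5
```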
Dockerfile:
```
FROM python:3.4
ENV PYTHONBUFFERED 1
WORKDIR /tasker
ADD requirements.txt /tasker/
RUN pip install -r requirements.txt
ADD . /tasker/
```
docker-compose.yml:
```yaml
rabbitmq:
  image: tutum/rabbitmq
  environment:
    - RABBITMQ_PASS=mypass
  ports:
    - "5672:5672"
    - "15672:15672"

celery:
  build: .
  command: celery worker --app=tasker.tasks
  volumes:
    - .:/tasker
  links:
    - rabbitmq:rabbit
```
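One assumption I'm now questioning: inside the celery container, `localhost` in the broker URL resolves to that container itself, not to RabbitMQ. Given the `rabbitmq:rabbit` link in the compose file, the broker host would presumably need to be the `rabbit` alias instead. A sketch of the URL I'm considering (hypothetical; `mypass` comes from `RABBITMQ_PASS` above, and I'm only parsing the string here for illustration):

```python
from urllib.parse import urlparse

# Hypothetical broker URL: use the "rabbit" link alias from the
# compose file instead of localhost, since localhost inside the
# celery container refers to the container itself.
broker_url = 'amqp://guest:mypass@rabbit//'

print(urlparse(broker_url).hostname)  # prints: rabbit
```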
Is there something I am missing? Why does the Celery container exit with code 1?