I have a Flask application with a Celery worker and Redis, and it works fine, as expected, on my local machine. I then tried to dockerize the application. When I create/start the services (the Flask application, Celery, and Redis) with sudo docker-compose up, all services start except Celery, which fails with the error

ImportError: No module named 'my_celery'

The same code runs on the local machine without any errors. Can anyone suggest a solution?
Dockerfile
FROM python:3.5-slim
WORKDIR celery_sample
ADD . /celery_sample
RUN pip install -r requirements.txt
EXPOSE 8000
docker-compose.yml
version: "3"
services:
  web:
    build:
      context: .
      dockerfile: Dockerfile
    command: "python my_celery.py"
    ports:
      - "8000:8000"
    networks:
      - webnet
    volumes:
      - .:/celery_sample
  redis:
    image: redis
    networks:
      - webnet
  celery:
    image: celery:3.1.25
    command: "celery worker -A my_celery -l INFO"
    volumes:
      - .:/celery_sample
    networks:
      - webnet
networks:
  webnet:
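My current guess is that the stock celery:3.1.25 image never sets its working directory to /celery_sample, so even with the volume mounted the worker may not find my_celery.py. Is something along these lines the right direction (untested sketch; it builds the same Dockerfile as the web service and sets working_dir explicitly)?

```yaml
celery:
  build:
    context: .
    dockerfile: Dockerfile
  command: "celery worker -A my_celery -l INFO"
  working_dir: /celery_sample   # my guess: run from the directory holding my_celery.py
  volumes:
    - .:/celery_sample
  networks:
    - webnet
```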
requirements.txt
flask==0.10
redis
requests==2.11.1
celery==3.1.25
my_celery.py (please ignore the task logic)
from flask import Flask
from celery import Celery

flask_app = Flask(__name__)
celery_app = Celery('my_celery')
celery_app.config_from_object('celeryconfig')

@celery_app.task
def add_celery():
    return str(int(10) + int(40))

@flask_app.route('/')
def index():
    return "Index Page"

@flask_app.route('/add')
def add_api():
    add_celery.delay()
    return "Added to Queue"

if __name__ == '__main__':
    flask_app.debug = True
    flask_app.run(host='0.0.0.0', port=8000)
celeryconfig.py
BROKER_URL = 'redis://localhost:6379/0'
CELERY_RESULT_BACKEND = 'redis://localhost:6379/0'
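In case it clarifies the question: my understanding of why this works locally is that Python resolves a bare module name like my_celery against the current working directory, which is presumably why the worker inside the container cannot import it. A minimal, Celery-free illustration of that resolution rule (pure stdlib; the my_celery.py written here is just a stand-in file, not my real module):

```python
import os
import subprocess
import sys
import tempfile

with tempfile.TemporaryDirectory() as module_dir:
    # Create a stand-in module with the same name as the application module.
    with open(os.path.join(module_dir, 'my_celery.py'), 'w') as f:
        f.write("value = 'imported'\n")

    # Importing succeeds when the working directory contains the module,
    # because the interpreter puts the working directory on sys.path.
    ok = subprocess.run([sys.executable, '-c', 'import my_celery'],
                        cwd=module_dir)
    print('from module dir:', ok.returncode)   # 0

    # From any other directory the same import fails, which mirrors
    # the ImportError the Celery container reports.
    with tempfile.TemporaryDirectory() as other_dir:
        bad = subprocess.run([sys.executable, '-c', 'import my_celery'],
                             cwd=other_dir,
                             stderr=subprocess.DEVNULL)
    print('elsewhere:', bad.returncode)        # non-zero
```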