Celery is started with the command below:
celery -A myproject worker --loglevel=debug --concurrency=3 -Q testqueue
myproject.py, as part of the main process, made some queries against the MySQL database before the workers were forked.
As part of the query flow in the main Django process, the ORM creates a SQLAlchemy connection pool if one does not already exist. Then the workers are forked.
Celery, as part of its Django fixup, closes existing connections:

def close_database(self, **kwargs):
    if self._close_old_connections:
        return self._close_old_connections()
What actually happens is that the SQLAlchemy pool object, holding one unused db connection, is copied into the 3 workers when the process forks. Thus 3 different pools hold 3 connection objects that all point to the same file descriptor.
When the workers perform tasks and request a db connection, each of them receives that same unused connection from its copy of the SQLAlchemy pool, since it is marked as idle. Because all of these connections point to the same file descriptor, the MySQL connection was lost.
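The shared-descriptor behaviour described above can be reproduced with a minimal sketch (assuming POSIX fork() semantics; the local socket here is only a stand-in for the pooled MySQL connection):

```python
import os
import socket

# A socket opened BEFORE forking, like the idle pooled connection.
server, client = socket.socketpair()

pids = []
for worker_id in (b"A", b"B", b"C"):
    pid = os.fork()
    if pid == 0:
        # Child "worker": it inherited the same underlying socket, so all
        # three children write into one shared byte stream, just as three
        # Celery workers would interleave MySQL protocol packets.
        client.sendall(worker_id)
        os._exit(0)
    pids.append(pid)

for pid in pids:
    os.waitpid(pid, 0)
client.close()

# All three "workers" wrote to the single shared descriptor.
data = b""
while len(data) < 3:
    data += server.recv(16)
server.close()
print(sorted(data.decode()))  # ['A', 'B', 'C']
```

With a real MySQL socket the interleaved writes corrupt the wire protocol, which is why the connection appears to vanish.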
The connections created after this point are all new and do not share a socket file descriptor.
Solution:
In the main process, add

from django.db import connection
connection.cursor()

before any other import, i.e. before the djorm-ext-pool module is imported.
This way all db queries in the main process use the connection created by Django outside the pool. When Celery's Django fixup closes that connection, it is actually closed rather than returned to the SQLAlchemy pool, leaving the pool empty at the moment it is copied into each worker on fork. When the workers later request a db connection, SQLAlchemy creates and returns fresh connections, one per worker.
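The difference the fix makes can be illustrated with a toy pool (purely illustrative; ToyPool is a hypothetical stand-in, not the SQLAlchemy API):

```python
class ToyPool:
    """Hypothetical stand-in for a SQLAlchemy-style pool (not the real API)."""

    def __init__(self):
        self.idle = []        # connections returned to the pool
        self.counter = 0      # labels each newly opened "connection"

    def connect(self):
        if self.idle:
            return self.idle.pop()      # reuse an idle pooled connection
        self.counter += 1
        return f"conn-{self.counter}"   # open a brand-new connection

    def release(self, conn):
        self.idle.append(conn)          # returned to the pool (the bug path)

    def close(self, conn):
        pass                            # truly closed, never pooled (the fix path)


# Bug: the parent's connection goes back into the pool, so every forked
# worker's copy of the pool hands out the very same connection object.
buggy = ToyPool()
c = buggy.connect()
buggy.release(c)
assert buggy.connect() == c

# Fix: the Django-created connection is truly closed before the fork,
# the pool is empty at fork time, and each worker opens a fresh connection.
fixed = ToyPool()
c = fixed.connect()
fixed.close(c)
assert fixed.connect() != c
```

In the buggy path the idle connection survives in every forked pool copy; in the fixed path the pool is empty when copied, so each worker is forced to open its own socket.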
Venkat kotra