We are developing a distributed Python application that uses Celery for its task queue.
The application needs to download email from a remote provider over IMAP (Gmail, for example), and we want these downloads to run in parallel. Each account allows only a limited number of simultaneous connections, so we need a way to atomically track the active connections across all of the accounts being downloaded.
I have found several examples of atomic locks for Celery using Redis, but none of them can track a pool of limited resources like this, and all of our attempts to implement our own have led to hard-to-debug race conditions: our locks are intermittently never released.
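What the question describes is a counting semaphore rather than a single lock. Below is a minimal sketch of one common way to build that on Redis, using a sorted set whose members are slot holders scored by acquisition timestamp, so a crashed worker's slot expires instead of leaking forever. This is an illustration, not the asker's code: the class, the key name, and the `timeout` parameter are all assumptions, and it expects a redis-py style client.

```python
import time
import uuid


class RedisSemaphore:
    """Counting semaphore over a Redis sorted set (illustrative sketch).

    Each holder is a member scored by its acquisition time, so a worker
    that dies without releasing is expired after `timeout` seconds
    rather than holding its slot forever.
    """

    def __init__(self, client, name, limit, timeout=60.0):
        self.client = client    # redis-py style client (assumed)
        self.name = name        # e.g. "imap-slots:user@gmail.com" (hypothetical key)
        self.limit = limit      # max simultaneous IMAP connections
        self.timeout = timeout  # seconds before a holder is presumed dead

    def acquire(self):
        """Try to claim a slot; return a release token, or None if full."""
        token = uuid.uuid4().hex
        now = time.time()
        pipe = self.client.pipeline(transaction=True)
        # Drop holders older than `timeout` (crashed workers), then claim.
        pipe.zremrangebyscore(self.name, 0, now - self.timeout)
        pipe.zadd(self.name, {token: now})
        pipe.zcard(self.name)
        count = pipe.execute()[-1]
        if count <= self.limit:
            return token
        # Over the limit: withdraw our claim so we don't hold a phantom slot.
        self.client.zrem(self.name, token)
        return None

    def release(self, token):
        self.client.zrem(self.name, token)
```

In a Celery task this would typically be used as: call `acquire()`, and if it returns `None`, `retry()` with a countdown; otherwise do the IMAP work and call `release(token)` in a `finally` block, so the slot is returned even when the download raises. The `timeout` expiry is what guards against the "lock never released" failure mode described above, at the cost that a genuinely slow download longer than `timeout` can lose its slot.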
Nficano