We have a simple Django Celery task running on Heroku. It looks roughly like this:
    @task
    def simple_task():
        for line in csv.reader(origin):
            process_line(line)

    def process_line(line):
        fields = parse_line(line)
        reg = Model1()
        ...
where origin is the CSV file. When the file is large (more than 50,000 lines), the task consumes all of the memory, causing R14 errors, and keeps growing until it is killed by the system (at 150% of the available 512 MB). The memory is never released, and we have to restart the task manually.
Running it on a Linux box, or locally on a development machine, works without any problems (all 170,000 lines). The memory leak seems to happen ONLY on Heroku. We are running with DEBUG = False, by the way.
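For reference, one variation we could try is processing the file in chunks and explicitly releasing memory between chunks. This is only a rough sketch of that idea; the chunk size and the db.reset_queries() / gc.collect() calls are our guesses, not something we have confirmed helps on Heroku:

    import csv
    import gc

    from celery import task
    from django import db

    @task
    def simple_task_chunked(path, chunk_size=1000):
        # Hypothetical variation: process the CSV row by row and try to
        # release memory every chunk_size rows.
        with open(path) as origin:
            for i, line in enumerate(csv.reader(origin), 1):
                process_line(line)
                if i % chunk_size == 0:
                    db.reset_queries()  # drop any query list Django may be keeping
                    gc.collect()        # ask Python to free unreferenced objects

Whether this actually changes anything on the Heroku dyno is exactly what we are unsure about.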
Is there something about running Celery tasks on Heroku that could cause this? Anything we're missing? This has become a showstopper for deploying to Heroku.
Any help would be greatly appreciated.