Celery - Memory Leak (memory doesn't get freed even after the worker completes the task)

Problem description

I have an atomic transaction running on a Celery worker that consumes a lot of memory, but the memory isn't freed after the task completes.

The solution that worked for me is to kill each Celery worker process after N tasks, i.e. to use CELERYD_MAX_TASKS_PER_CHILD.

Is there any other solution to this problem? And what would be a good value for CELERYD_MAX_TASKS_PER_CHILD if Celery receives around 10,000 tasks per day?

Tags: python, django, celery

Solution


There's an open issue on Celery's tracker that may be worth checking out.

Your workaround is quite reasonable; it's what we used in our own business, and it simply worked. The key point is that Celery uses worker pools: it doesn't kill worker processes after each task but reuses them for subsequent tasks, which means process-level resources can leak over time.
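To make the recycling concrete, here is a minimal sketch of enabling it from the command line. The project name `proj` and the concurrency value are placeholders; `--max-tasks-per-child` is the CLI equivalent of the CELERYD_MAX_TASKS_PER_CHILD setting (named `worker_max_tasks_per_child` in Celery 4+ lowercase config):

```shell
# Recycle each pool process after it has executed 100 tasks,
# so any memory it leaked is returned to the OS on restart.
celery -A proj worker --concurrency=4 --max-tasks-per-child=100
```

A lower value frees leaked memory sooner at the cost of more frequent process restarts.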

You can measure the time it takes for a pool process to start and shut down, and derive the setting from the overhead you're willing to accept. For example, if your tasks take 20 seconds, a process takes 2 seconds in total to start and eventually shut down, and you tolerate a 5% overhead, you can set CELERYD_MAX_TASKS_PER_CHILD to 2. It depends on how much restart overhead, and how much leakage, you can tolerate.
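That rule of thumb can be written as a small formula: restart cost per batch divided by the work done in that batch must stay within the tolerance, i.e. restart_cost / (N * task_time) <= tolerance. A minimal sketch (the helper name is hypothetical, not part of Celery):

```python
import math

def max_tasks_per_child(task_time_s, restart_cost_s, overhead_tolerance):
    """Smallest N such that restart_cost_s / (N * task_time_s) <= overhead_tolerance.

    Solving for N gives N >= restart_cost_s / (task_time_s * overhead_tolerance).
    """
    return math.ceil(restart_cost_s / (task_time_s * overhead_tolerance))

# The example from above: 20 s tasks, 2 s restart cost, 5% tolerated overhead
print(max_tasks_per_child(20, 2, 0.05))  # -> 2
```

For very cheap tasks the same formula gives a much larger N, which is why there is no single "good number": it follows from your task duration and restart cost.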

