python - Celery - Memory Leak (memory isn't freed even after the worker completes the task)
Question
I have an atomic transaction running on a Celery server that consumes a lot of memory, but the memory isn't freed after the task completes.
The workaround that worked for me is to kill each Celery worker after N tasks, i.e. to use CELERYD_MAX_TASKS_PER_CHILD.
Is there any other solution to this problem? And what would be a good value for CELERYD_MAX_TASKS_PER_CHILD if Celery receives around 10,000 tasks per day?
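For context, the workaround above is just a setting in the Celery configuration module. A minimal sketch (the value 100 is an arbitrary placeholder, not a recommendation; on Celery 4+ the lowercase name worker_max_tasks_per_child is the modern equivalent of the old uppercase setting):

```python
# celeryconfig.py -- sketch of the workaround described above.
# Recycle each worker process after it has executed this many tasks,
# so any memory leaked at the process level is reclaimed by the OS.
CELERYD_MAX_TASKS_PER_CHILD = 100  # old-style name (Celery 3.x)
# worker_max_tasks_per_child = 100  # equivalent lowercase name on Celery 4+
```

The same limit can also be passed on the command line with `celery worker --max-tasks-per-child=100`.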
Solution
There's an open issue on Celery's tracker that may be worth checking out.
Your workaround is quite fair and is what we used in our own business; it simply worked. The notable point is that Celery uses worker pools, which means it doesn't kill worker processes after each task but reuses them for subsequent tasks, so process-level resources can leak over time.
You can measure how long it takes for a worker process to start and exit. For example, if your tasks take 20 seconds, a process takes 2 seconds to start and finally exit, and you can tolerate an overhead of 5%, then you can set CELERYD_MAX_TASKS_PER_CHILD to 2. The right value depends on how much overhead and leakage you can tolerate.
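The sizing rule above can be written out as a small calculation: the restart cost is amortized over N tasks, so you want the smallest N whose amortized overhead stays under your tolerance (the function name here is just illustrative):

```python
import math

def min_tasks_per_child(task_seconds, restart_seconds, tolerance):
    """Smallest N such that restart_seconds / (task_seconds * N) <= tolerance,
    i.e. the per-task share of the worker restart cost stays under the
    tolerated overhead fraction."""
    return math.ceil(restart_seconds / (task_seconds * tolerance))

# The answer's numbers: 20 s tasks, 2 s restart cost, 5% tolerated overhead.
print(min_tasks_per_child(20, 2, 0.05))  # -> 2
```

With 10,000 tasks per day this would recycle workers a few thousand times daily, which is why measuring the actual restart cost on your own hardware matters before picking a value.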