python - Celery task not running and stuck in PENDING
Question
I was following one of the various tutorials on the internet and set up a Flask/RabbitMQ/Celery application with Docker/Docker Compose. The containers all appear to start successfully, but when I hit the endpoint, the application hangs. The task seems to be stuck in PENDING and never actually completes. There are no errors in the Docker output, so I'm really confused about why this isn't working. The only output I see when I hit the endpoint is:
rabbit_1 | 2021-05-13 01:38:07.942 [info] <0.760.0> accepting AMQP connection <0.760.0> (172.19.0.4:45414 -> 172.19.0.2:5672)
rabbit_1 | 2021-05-13 01:38:07.943 [info] <0.760.0> connection <0.760.0> (172.19.0.4:45414 -> 172.19.0.2:5672): user 'rabbitmq' authenticated and granted access to vhost '/'
rabbit_1 | 2021-05-13 01:38:07.952 [info] <0.776.0> accepting AMQP connection <0.776.0> (172.19.0.4:45416 -> 172.19.0.2:5672)
rabbit_1 | 2021-05-13 01:38:07.953 [info] <0.776.0> connection <0.776.0> (172.19.0.4:45416 -> 172.19.0.2:5672): user 'rabbitmq' authenticated and granted access to vhost '/'
I'm really not sure what I'm doing wrong, as the documentation hasn't been much help.
Dockerfile
FROM python:3
COPY ./requirements.txt /app/requirements.txt
WORKDIR /app
RUN pip install -r requirements.txt
COPY . /app
ENTRYPOINT [ "python" ]
CMD ["app.py","--host=0.0.0.0"]
Flask app.py
from workerA import add_nums

from flask import (
    Flask,
    request,
    jsonify,
)

app = Flask(__name__)

@app.route("/add")
def add():
    first_num = request.args.get('f')
    second_num = request.args.get('s')
    result = add_nums.delay(first_num, second_num)
    return jsonify({'result': result.get()}), 200

if __name__ == '__main__':
    app.run(debug=True, host='0.0.0.0')
Celery workerA.py
from celery import Celery
# Celery configuration
CELERY_BROKER_URL = 'amqp://rabbitmq:rabbitmq@rabbit:5672/'
CELERY_RESULT_BACKEND = 'rpc://'
# Initialize Celery
celery = Celery('workerA', broker=CELERY_BROKER_URL, backend=CELERY_RESULT_BACKEND)
@celery.task()
def add_nums(a, b):
    return a + b
docker-compose.yml
version: "3"
services:
  web:
    build:
      context: .
      dockerfile: Dockerfile
    restart: always
    ports:
      - "5000:5000"
    depends_on:
      - rabbit
    volumes:
      - .:/app
  rabbit:
    hostname: rabbit
    image: rabbitmq:management
    environment:
      - RABBITMQ_DEFAULT_USER=rabbitmq
      - RABBITMQ_DEFAULT_PASS=rabbitmq
    ports:
      - "5673:5672"
      - "15672:15672"
  worker_1:
    build:
      context: .
    hostname: worker_1
    entrypoint: celery
    command: -A workerA worker --loglevel=info -Q workerA
    volumes:
      - .:/app
    links:
      - rabbit
    depends_on:
      - rabbit
Solution
Well, after a lot of research, I determined the problem was the queue name for the task. The task was being published to Celery's default queue, while the worker only consumed from the workerA queue (because of the -Q workerA flag), so the message was never picked up. I adjusted my decorator like this:
@celery.task(queue='workerA')
def add_nums(a, b):
    return a + b
Now it works!