Celery task not running and stuck in PENDING

Problem description

I was following one of the many tutorials on the internet and set up a Flask/RabbitMQ/Celery application with Docker/Docker Compose. The containers all appear to start successfully, but when I hit the endpoint the application hangs. The task seems to be stuck in PENDING and never actually completes. There are no errors in the Docker output, so I'm really confused about why this isn't working. The only output I see when I hit the endpoint is:

rabbit_1    | 2021-05-13 01:38:07.942 [info] <0.760.0> accepting AMQP connection <0.760.0> (172.19.0.4:45414 -> 172.19.0.2:5672)
rabbit_1    | 2021-05-13 01:38:07.943 [info] <0.760.0> connection <0.760.0> (172.19.0.4:45414 -> 172.19.0.2:5672): user 'rabbitmq' authenticated and granted access to vhost '/'
rabbit_1    | 2021-05-13 01:38:07.952 [info] <0.776.0> accepting AMQP connection <0.776.0> (172.19.0.4:45416 -> 172.19.0.2:5672)
rabbit_1    | 2021-05-13 01:38:07.953 [info] <0.776.0> connection <0.776.0> (172.19.0.4:45416 -> 172.19.0.2:5672): user 'rabbitmq' authenticated and granted access to vhost '/'

I'm really not sure what I'm doing wrong, as the documentation hasn't been much help.

Dockerfile

FROM python:3
COPY ./requirements.txt /app/requirements.txt
WORKDIR /app
RUN pip install -r requirements.txt
COPY . /app
ENTRYPOINT [ "python" ]
CMD ["app.py","--host=0.0.0.0"]

Flask app.py

from workerA import add_nums
from flask import (
    Flask,
    request,
    jsonify,
)
app = Flask(__name__)


@app.route("/add")
def add():
    # convert query params to ints so the task adds numbers, not strings
    first_num = request.args.get('f', type=int)
    second_num = request.args.get('s', type=int)
    result = add_nums.delay(first_num, second_num)
    return jsonify({'result': result.get()}), 200



if __name__ == '__main__':
    app.run(debug=True, host='0.0.0.0')

Celery workerA.py

from celery import Celery
# Celery configuration
CELERY_BROKER_URL = 'amqp://rabbitmq:rabbitmq@rabbit:5672/'
CELERY_RESULT_BACKEND = 'rpc://'
# Initialize Celery
celery = Celery('workerA', broker=CELERY_BROKER_URL, backend=CELERY_RESULT_BACKEND)


@celery.task()
def add_nums(a, b):
    return a + b

docker-compose.yml

version: "3"
services:
  web:
    build:
      context: .
      dockerfile: Dockerfile
    restart: always
    ports:
      - "5000:5000"
    depends_on:
      - rabbit
    volumes:
      - .:/app
  rabbit:
    hostname: rabbit
    image: rabbitmq:management
    environment:
      - RABBITMQ_DEFAULT_USER=rabbitmq
      - RABBITMQ_DEFAULT_PASS=rabbitmq
    ports:
      - "5673:5672"
      - "15672:15672"
  worker_1:
    build:
      context: .
    hostname: worker_1
    entrypoint: celery
    command: -A workerA worker --loglevel=info -Q workerA
    volumes:
      - .:/app
    links:
      - rabbit
    depends_on:
      - rabbit

Tags: python, docker, flask, rabbitmq, celery

Solution


OK, after a lot of research I determined that the problem was the task's queue name. The worker is started with `-Q workerA`, so it only consumes from the `workerA` queue, while `add_nums.delay()` was publishing to Celery's default `celery` queue, meaning the message was never picked up. I adjusted my decorator like this:

@celery.task(queue='workerA')
def add_nums(a, b):
    return a + b

Now it works!

