Airflow running a python file fails: python: can't open file

Problem description

I have a tree like this in my project folder.

I created an Airflow service in a Docker container:

Dockerfile

#Base image
FROM puckel/docker-airflow:1.10.1

#Impersonate
USER root

#Logs automatically thrown to the I/O stream and not buffered.
ENV PYTHONUNBUFFERED 1

ENV AIRFLOW_HOME=/usr/local/airflow
ENV PYTHONPATH "${PYTHONPATH}:/libraries"

WORKDIR /
#Add docker source files to the docker machine
ADD ./docker_resources ./docker_resources
#Install libraries and dependencies
RUN apt-get update && apt-get install -y vim
RUN pip install --user psycopg2-binary
RUN pip install -r docker_resources/requirements.pip
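The effect of the ENV PYTHONPATH line above can be reproduced outside Docker: entries listed in PYTHONPATH are added to sys.path when a new interpreter starts, even if the folder does not exist yet. A minimal sketch (the /libraries path mirrors the Dockerfile; this is an illustration, not the container itself):

```python
import os
import subprocess
import sys

# Launch a fresh interpreter with PYTHONPATH set, as the Dockerfile's
# ENV PYTHONPATH "${PYTHONPATH}:/libraries" would do inside the container.
env = {**os.environ, "PYTHONPATH": "/libraries"}
out = subprocess.run(
    [sys.executable, "-c", "import sys; print('/libraries' in sys.path)"],
    env=env,
    capture_output=True,
    text=True,
).stdout.strip()
print(out)  # the entry appears on sys.path whether or not the folder exists
```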


docker-compose.yml
version: '3'
services:
  postgres:
    image: postgres:9.6
    container_name: "postgres"
    environment:
      - POSTGRES_USER=airflow
      - POSTGRES_PASSWORD=airflow
      - POSTGRES_DB=airflow
    ports:
      - "5432:5432"
  webserver:
    build: .
    restart: always
    depends_on:
      - postgres
    volumes:
      - ./dags:/usr/local/airflow/dags
      - ./libraries:/libraries
      - ./python_scripts:/python_scripts
    ports:
      - "8080:8080"
    command: webserver
    healthcheck:
      test: ["CMD-SHELL", "[ -f /usr/local/airflow/airflow-webserver.pid ]"]
      interval: 30s
      timeout: 30s
      retries: 3
  scheduler:
    build: .
    restart: always
    depends_on:
      - postgres
    volumes:
      - ./dags:/usr/local/airflow/dags
      - ./logs:/usr/local/airflow/logs
    ports:
      - "8793:8793"
    command: scheduler
    healthcheck:
      test: ["CMD-SHELL", "[ -f /usr/local/airflow/airflow-scheduler.pid ]"]
      interval: 30s
      timeout: 30s
      retries: 3
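Each short-syntax volumes entry above maps a host path (left of the colon) to a container path (right); note that the webserver and scheduler services list different volumes. A minimal sketch of how such an entry splits, using entries copied from the webserver service:

```python
def split_volume(entry):
    """Split a compose short-syntax volume entry into (host, container)."""
    host, container = entry.split(":", 1)
    return host, container

# Entries taken from the webserver service above.
for entry in [
    "./dags:/usr/local/airflow/dags",
    "./libraries:/libraries",
    "./python_scripts:/python_scripts",
]:
    print(split_volume(entry))
```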

My dags folder has one tutorial DAG:

from datetime import timedelta
# The DAG object; we'll need this to instantiate a DAG
from airflow import DAG
# Operators; we need this to operate!
from airflow.operators.bash_operator import BashOperator
from airflow.utils.dates import days_ago
# These args will get passed on to each operator
# You can override them on a per-task basis during operator initialization
default_args = {
    'owner': 'airflow',
    'depends_on_past': False,
    'start_date': days_ago(2),
    'email': ['xxx@xxx.com'],
    'email_on_failure': False,
    'email_on_retry': False,
    'retries': 0,
    'retry_delay': timedelta(minutes=5),
    'schedule_interval': '@daily',
}

dag = DAG(
    'Tutorial',
    default_args=default_args,
    description='A simple tutorial DAG with production tables',
    catchup=False
)

task_1 = BashOperator(
    task_id='my_task',
    bash_command='python /python_scripts/my_script.py',
    dag=dag,
)
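BashOperator runs bash_command from a temporary working directory, so a relative path would resolve against that temp location rather than the DAG folder, while an absolute path is unaffected. A rough sketch of that behavior (not Airflow's actual implementation):

```python
import subprocess
import tempfile

def run_bash(bash_command):
    """Run a shell command from a temporary cwd, roughly as BashOperator does."""
    with tempfile.TemporaryDirectory() as tmp_dir:
        # Relative paths in bash_command resolve against tmp_dir, which is
        # why 'python my_script.py' fails while an absolute path can work.
        result = subprocess.run(["bash", "-c", bash_command], cwd=tmp_dir)
    return result.returncode
```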

I tried changing bash_command='python /python_scripts/my_script.py' to other variants.

All of them failed. I tried them because BashOperator runs the command from a tmp folder. If I get into the machine and run an ls command, I can see python_scripts. Running python /python_scripts/my_script.py manually from /usr/local/airflow even works.

The error is always:

INFO - python: can't open file

I searched and found people who solved this with absolute paths, but I couldn't get it to work.
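The message itself comes from the Python interpreter when the path handed to it does not exist, as happens when a mounted folder is empty inside the container. A minimal reproduction (the missing path here is made up for illustration):

```python
import subprocess
import sys

# Ask a fresh interpreter to run a file that does not exist; the
# interpreter exits non-zero and reports it can't open the file.
result = subprocess.run(
    [sys.executable, "/definitely/missing_script.py"],
    capture_output=True,
    text=True,
)
print(result.returncode != 0)
print("can't open file" in result.stderr)
```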

EDIT: If, in the Dockerfile, I add ADD ./ ./ below WORKDIR / and remove these volumes from docker-compose.yml:

 1. ./libraries:/libraries

 2. ./python_scripts:/python_scripts

then the error is no longer that the file can't be found, but that the libraries can't be found: Import module error. This is an improvement, but it makes no sense, since PYTHONPATH is defined to include the /libraries folder.

The volumes make more sense than the ADD statement, because I need changes to the code to be applied inside Docker immediately.

EDIT 2: The volumes are mounted, but the folders inside the container are empty, which is why the files are not found. When I use ADD ./ ./, the files are in the folders because everything is copied in; still it doesn't work, since then the libraries can't be found either.

Tags: python, docker, airflow, airflow-scheduler

Solution


Have you tried

bash_command='python /usr/local/airflow/python_scripts/my_script.py' 

You must check that the folder has the right permissions (access and execute for your user).
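The permission check the answer suggests can be scripted; a minimal sketch using os.access (the container path in the comment is the one from the answer and is assumed, not verified here):

```python
import os

def can_run_script(path):
    """True if the file's folder is searchable and the file is readable
    for the current user -- the permissions the answer asks to check."""
    folder = os.path.dirname(path) or "."
    return os.access(folder, os.X_OK) and os.access(path, os.R_OK)

# Inside the container this would be called as:
# can_run_script("/usr/local/airflow/python_scripts/my_script.py")
print(can_run_script(os.__file__))  # checked against a file that exists locally
```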

