python - Airflow airflow.exceptions.AirflowException: Failed to create remote temp file with SSHExecuteOperator
Question
I am trying to run a simple SSHExecuteOperator in Airflow.
Here is my .py file:
import airflow
from airflow.models import DAG
from airflow.contrib.hooks.ssh_hook import SSHHook
from airflow.contrib.operators.ssh_execute_operator import SSHExecuteOperator
from datetime import timedelta

default_args = {
    'owner': 'airflow',
    'start_date': airflow.utils.dates.days_ago(2),
    'retries': 3
}

dag = DAG('Nas_Hdfs', description='Simple tutorial DAG',
          schedule_interval=None, default_args=default_args,
          catchup=False)

sshHook = SSHHook(conn_id='101')
sshHook.no_host_key_check = True

t2 = SSHExecuteOperator(task_id="NAS_TO_HDFS_FILE_COPY",
                        bash_command="hostname ",
                        ssh_hook=sshHook,
                        dag=dag)
t2
I get the following error:
ERROR - Failed to create remote temp file
Here is the full log:
INFO - Subtask: --------------------------------------------------------------------------------
INFO - Subtask: Starting attempt 1 of 4
INFO - Subtask: --------------------------------------------------------------------------------
INFO - Subtask:
INFO - Subtask: [2018-05-28 08:54:22,812] {models.py:1342} INFO - Executing <Task(SSHExecuteOperator): NAS_TO_HDFS_FILE_COPY> on 2018-05-28 08:54:12.876538
INFO - Subtask: [2018-05-28 08:54:23,303] {models.py:1417} ERROR - Failed to create remote temp file
INFO - Subtask: Traceback (most recent call last):
INFO - Subtask: File "/opt/miniconda3/lib/python2.7/site-packages/airflow/models.py", line 1374, in run
INFO - Subtask: result = task_copy.execute(context=context)
INFO - Subtask: File "/opt/miniconda3/lib/python2.7/site-packages/airflow/contrib/operators/ssh_execute_operator.py", line 128, in execute
INFO - Subtask: self.task_id) as remote_file_path:
INFO - Subtask: File "/opt/miniconda3/lib/python2.7/site-packages/airflow/contrib/operators/ssh_execute_operator.py", line 64, in __enter__
INFO - Subtask: raise AirflowException("Failed to create remote temp file")
INFO - Subtask: AirflowException: Failed to create remote temp file
INFO - Subtask: [2018-05-28 08:54:23,304] {models.py:1433} INFO - Marking task as UP_FOR_RETRY
INFO - Subtask: [2018-05-28 08:54:23,342] {models.py:1462} ERROR - Failed to create remote temp file
INFO - Subtask: Traceback (most recent call last):
INFO - Subtask: File "/opt/miniconda3/bin/airflow", line 28, in <module>
INFO - Subtask: args.func(args)
INFO - Subtask: File "/opt/miniconda3/lib/python2.7/site-packages/airflow/bin/cli.py", line 422, in run
INFO - Subtask: pool=args.pool,
INFO - Subtask: File "/opt/miniconda3/lib/python2.7/site-packages/airflow/utils/db.py", line 53, in wrapper
INFO - Subtask: result = func(*args, **kwargs)
INFO - Subtask: File "/opt/miniconda3/lib/python2.7/site-packages/airflow/models.py", line 1374, in run
INFO - Subtask: result = task_copy.execute(context=context)
INFO - Subtask: File "/opt/miniconda3/lib/python2.7/site-packages/airflow/contrib/operators/ssh_execute_operator.py", line 128, in execute
INFO - Subtask: self.task_id) as remote_file_path:
INFO - Subtask: File "/opt/miniconda3/lib/python2.7/site-packages/airflow/contrib/operators/ssh_execute_operator.py", line 64, in __enter__
INFO - Subtask: raise AirflowException("Failed to create remote temp file")
INFO - Subtask: airflow.exceptions.AirflowException: Failed to create remote temp file
INFO - Task exited with return code 1
Any help is much appreciated!
Edit: I ran this in a Python shell as the airflow user, and here is the output:
from airflow.contrib.hooks.ssh_hook import SSHHook
sshHook = SSHHook(conn_id='101')
sshHook.no_host_key_check = True
sshHook.Popen(["-q", "mktemp", "--tmpdir", "tmp_XXXXXX"])
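Under the hood, this version of SSHHook shells out to the `ssh` binary, so the operator's temp-file step is roughly `ssh -q <host> mktemp --tmpdir tmp_XXXXXX`. If the SSH connection itself works, running the same `mktemp` invocation locally helps rule out flag/syntax problems on the remote side (GNU coreutils `mktemp` assumed):

```shell
# Same mktemp invocation the operator issues on the remote host;
# running it locally verifies the flags and template are valid.
tmpfile=$(mktemp --tmpdir tmp_XXXXXX)
echo "created: $tmpfile"
rm -f "$tmpfile"
```

If this fails locally too, the remote host may have a non-GNU `mktemp` that does not accept `--tmpdir`.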
Solution
Make sure you follow these 3 steps:
- Use an SSH key instead of a password
- Point "key_file" at the id_rsa file, not id_rsa.pub
- The airflow user needs ownership of, and permissions 0600 on, the id_rsa and id_rsa.pub files
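The ownership/permission step can be sketched as below. The key paths are assumptions (a typical `/home/airflow/.ssh` layout); adjust them to wherever your airflow user's keys actually live:

```shell
# Hypothetical key paths -- adjust to the airflow user's home directory.
KEY_DIR=/home/airflow/.ssh
if [ -f "$KEY_DIR/id_rsa" ]; then
    # The private key must be owned by the airflow user and readable only by it,
    # otherwise ssh refuses to use it and the operator cannot open a session.
    chown airflow:airflow "$KEY_DIR/id_rsa" "$KEY_DIR/id_rsa.pub"
    chmod 0600 "$KEY_DIR/id_rsa" "$KEY_DIR/id_rsa.pub"
fi
```

OpenSSH silently skips private keys with group- or world-readable permissions, which surfaces here only as the generic "Failed to create remote temp file" error.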