Dockerfile: cp command fails to move a file inside the container

Problem description

Hi, I am trying to download a file in the container and then move it to a specific location inside the container:

RUN wget https://storage.googleapis.com/hadoop-lib/gcs/gcs-connector-latest-hadoop2.jar 
RUN cp gcs-connector-latest-hadoop2.jar /opt/spark-2.2.1-bin-hadoop2.7/jars/
RUN cp /opt/spark-2.2.1-bin-hadoop2.7/conf/spark-defaults.conf.template /opt/spark-2.2.1-bin-hadoop2.7/conf/spark-defaults.conf
RUN echo "spark.hadoop.google.cloud.auth.service.account.enable true" > /opt/spark-2.2.1-bin-hadoop2.7/conf/spark-defaults.conf

But this fails with the following error:

Step 44/46 : RUN cp gcs-connector-latest-hadoop2.jar /opt/spark-2.2.1-bin-hadoop2.7/jars/
 ---> Running in 8c81d9871377
cp: cannot create regular file '/opt/spark-2.2.1-bin-hadoop2.7/jars/': No such file or directory
The command '/bin/sh -c cp gcs-connector-latest-hadoop2.jar /opt/spark-2.2.1-bin-hadoop2.7/jars/' returned a non-zero code: 1

EDIT-1 (error screenshot)

I have tried the suggested solution, and now I get the following error:

Removing intermediate container e885431017e8
Step 43/44 : COPY /opt/spark-2.2.1-bin-hadoop2.7/conf/spark-defaults.conf.template /opt/spark-2.2.1-bin-hadoop2.7/conf/spark-defaults.conf
lstat opt/spark-2.2.1-bin-hadoop2.7/conf/spark-defaults.conf.template: no such file or directory

Tags: docker, dockerfile

Solution


Does the path /opt/spark-2.2.1-bin-hadoop2.7/jars/ already exist in your container?

If it does not, add this before the cp command:

RUN mkdir -p /opt/spark-2.2.1-bin-hadoop2.7/jars/

Then try the copy like this:

cp gcs-connector-latest-hadoop2.jar /opt/spark-2.2.1-bin-hadoop2.7/jars/gcs-connector-latest-hadoop2.jar
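The underlying behavior is easy to reproduce outside Docker: cp never creates missing parent directories for its destination, while mkdir -p creates the whole path. A minimal shell sketch, using a temporary directory and a dummy jar file as stand-ins for the real paths:

```shell
# cp does not create missing destination directories; mkdir -p does.
set -e
tmp=$(mktemp -d)
cd "$tmp"
echo "dummy jar contents" > gcs-connector-latest-hadoop2.jar

# Without the target directory, cp fails just like in the build log:
cp gcs-connector-latest-hadoop2.jar jars-dir/ 2>/dev/null \
  && echo "copy succeeded" \
  || echo "cp failed: destination directory does not exist"

# Create the full path first, then the same copy succeeds:
mkdir -p jars-dir
cp gcs-connector-latest-hadoop2.jar jars-dir/
ls jars-dir
```

Inside a Dockerfile the same two steps simply become RUN instructions, and they can be chained with && so they share one layer.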

Regarding the edit:

You ran mkdir and then tried to copy from that folder; since the folder is empty, this cannot work! Note that the COPY instruction copies files from the build context on the host, not from inside the image, which is why it fails with "no such file or directory". To copy a file that already exists inside the image, use RUN cp instead.
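Putting both fixes together, the relevant Dockerfile steps could look like the sketch below. It assumes the Spark 2.2.1 layout from the question; note it also uses >> in the echo step as an assumed correction, so the setting is appended to the copied template rather than overwriting the whole file as the original > would.

```dockerfile
RUN wget https://storage.googleapis.com/hadoop-lib/gcs/gcs-connector-latest-hadoop2.jar

# Create the destination directory first, then copy inside the image with RUN cp
# (COPY would look for the source file in the host build context and fail).
RUN mkdir -p /opt/spark-2.2.1-bin-hadoop2.7/jars/ \
 && cp gcs-connector-latest-hadoop2.jar /opt/spark-2.2.1-bin-hadoop2.7/jars/

RUN cp /opt/spark-2.2.1-bin-hadoop2.7/conf/spark-defaults.conf.template /opt/spark-2.2.1-bin-hadoop2.7/conf/spark-defaults.conf

# Append the setting; ">" would replace the file's entire contents.
RUN echo "spark.hadoop.google.cloud.auth.service.account.enable true" >> /opt/spark-2.2.1-bin-hadoop2.7/conf/spark-defaults.conf
```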
