Hadoop with openjdk: start-dfs.sh error (SSH?)

Problem description

I ran into a problem while setting up a 4-node hadoop architecture following this tutorial. I have the following 4 machines (virtualized):

I set up all my conf files on the master node and exported them to the other machines with scp. The master node can reach the slave nodes over ssh. I set JAVA_HOME in the .bashrc on all machines. However, this is what I get:

hadoop@master-node:~$ start-dfs.sh
WARNING: An illegal reflective access operation has occurred
WARNING: Illegal reflective access by org.apache.hadoop.security.authentication.util.KerberosUtil (file:/home/hadoop/hadoop/share/hadoop/common/lib/hadoop-auth-2.8.4.jar) to method sun.security.krb5.Config.getInstance()
WARNING: Please consider reporting this to the maintainers of org.apache.hadoop.security.authentication.util.KerberosUtil
WARNING: Use --illegal-access=warn to enable warnings of further illegal reflective access operations
WARNING: All illegal access operations will be denied in a future release
Starting namenodes on [node-master]
node-master: ssh: connect to host node-master port 22: Connection timed out
node1: Error: JAVA_HOME is not set and could not be found.
node2: Error: JAVA_HOME is not set and could not be found.
node3: Error: JAVA_HOME is not set and could not be found.
Starting secondary namenodes [0.0.0.0]
hadoop@0.0.0.0's password: 
0.0.0.0: Error: JAVA_HOME is not set and could not be found.
WARNING: An illegal reflective access operation has occurred
WARNING: Illegal reflective access by org.apache.hadoop.security.authentication.util.KerberosUtil (file:/home/hadoop/hadoop/share/hadoop/common/lib/hadoop-auth-2.8.4.jar) to method sun.security.krb5.Config.getInstance()
WARNING: Please consider reporting this to the maintainers of org.apache.hadoop.security.authentication.util.KerberosUtil
WARNING: Use --illegal-access=warn to enable warnings of further illegal reflective access operations
WARNING: All illegal access operations will be denied in a future release

[3 possibilities] There seems to be an issue with using openJDK 11, though I am not sure that is what is causing this mess. The errors suggest an ssh problem, but i) I uploaded my conf files without any trouble, and ii) I can reach every node from the master. Could it be related to the way the JAVA_HOME path is set? Here is the end of my .bashrc:

export JAVA_HOME=/usr/lib/jvm/java-11-openjdk-amd64
export PATH=PATH:$PATH/bin
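
For what it's worth, a minimal check of whether that export is actually visible in a non-interactive ssh shell (a sketch using node1, one of the worker hostnames from the output above):

ssh node1 'echo $JAVA_HOME'    # empty output likely means the export in .bashrc is never reached non-interactively
ssh node1 'which java'         # shows whether java is on the PATH for remote, non-interactive commands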

Thanks in advance for any clue (I don't work much with java and I feel a bit lost here).

[EDIT] Same with OracleJDK8:

hadoop@master-node:~$  readlink -f /usr/bin/java
/usr/lib/jvm/java-8-oracle/jre/bin/java
hadoop@master-node:~$ export JAVA_HOME=/usr/lib/jvm/java-8-oracle/jre
hadoop@master-node:~$ start-dfs.sh
Starting namenodes on [node-master]
node-master: ssh: connect to host node-master port 22: Connection timed out
node1: Error: JAVA_HOME is not set and could not be found.
node3: Error: JAVA_HOME is not set and could not be found.
node2: Error: JAVA_HOME is not set and could not be found.
Starting secondary namenodes [0.0.0.0]
hadoop@0.0.0.0's password: 

0.0.0.0: Error: JAVA_HOME is not set and could not be found.

Tags: java, hadoop, ssh

Solution


You can export the paths like this (note that your current .bashrc appends PATH=PATH:$PATH/bin instead of $PATH:$JAVA_HOME/bin):

export JAVA_HOME=/usr/lib/jvm/java-11-openjdk-amd64
export PATH=$PATH:$JAVA_HOME/bin

Then you have to run the following command so that your PATH actually picks up the JAVA_HOME variable. After appending the JAVA_HOME and PATH entries to the .bashrc file, run:

source ~/.bashrc

Then check echo $PATH; if the value contains the JAVA_HOME path, it should work.
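
For example, a quick verification after sourcing (a sketch; the exact paths assume the openJDK 11 install above):

echo $JAVA_HOME                        # should print /usr/lib/jvm/java-11-openjdk-amd64
echo $PATH | tr ':' '\n' | grep java   # the PATH should now contain $JAVA_HOME/bin
$JAVA_HOME/bin/java -version           # confirms the JVM is reachable where Hadoop expects it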

