java - Spark fails to start after installation: Unable to find any JVMs matching version "1.8"
Problem description
Description:
I installed Spark on my MacBook using Homebrew, following these instructions: https://www.tutorialkart.com/apache-spark/how-to-install-spark-on-mac-os/.
The step-by-step process is to install Java, then Scala, then Spark. Java and Scala installed successfully, and so did Spark.
When I tried to verify the Spark installation with the command below, I ran into an error.
Command entered: spark-shell
Expected behavior: Spark starts up in the terminal
Actual behavior: I get the following error:
Unable to find any JVMs matching version "1.8".
WARNING: An illegal reflective access operation has occurred
WARNING: Illegal reflective access by org.apache.spark.unsafe.Platform (file:/usr/local/Cellar/apache-spark/2.4.5/libexec/jars/spark-unsafe_2.11-2.4.5.jar) to method java.nio.Bits.unaligned()
WARNING: Please consider reporting this to the maintainers of org.apache.spark.unsafe.Platform
WARNING: Use --illegal-access=warn to enable warnings of further illegal reflective access operations
WARNING: All illegal access operations will be denied in a future release
Exception in thread "main" java.lang.ExceptionInInitializerError
at org.apache.hadoop.util.StringUtils.<clinit>(StringUtils.java:80)
at org.apache.hadoop.security.SecurityUtil.getAuthenticationMethod(SecurityUtil.java:611)
at org.apache.hadoop.security.UserGroupInformation.initialize(UserGroupInformation.java:273)
at org.apache.hadoop.security.UserGroupInformation.ensureInitialized(UserGroupInformation.java:261)
at org.apache.hadoop.security.UserGroupInformation.loginUserFromSubject(UserGroupInformation.java:791)
at org.apache.hadoop.security.UserGroupInformation.getLoginUser(UserGroupInformation.java:761)
at org.apache.hadoop.security.UserGroupInformation.getCurrentUser(UserGroupInformation.java:634)
at org.apache.spark.util.Utils$$anonfun$getCurrentUserName$1.apply(Utils.scala:2422)
at org.apache.spark.util.Utils$$anonfun$getCurrentUserName$1.apply(Utils.scala:2422)
at scala.Option.getOrElse(Option.scala:121)
at org.apache.spark.util.Utils$.getCurrentUserName(Utils.scala:2422)
at org.apache.spark.SecurityManager.<init>(SecurityManager.scala:79)
at org.apache.spark.deploy.SparkSubmit.secMgr$lzycompute$1(SparkSubmit.scala:348)
at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$secMgr$1(SparkSubmit.scala:348)
at org.apache.spark.deploy.SparkSubmit$$anonfun$prepareSubmitEnvironment$7.apply(SparkSubmit.scala:356)
at org.apache.spark.deploy.SparkSubmit$$anonfun$prepareSubmitEnvironment$7.apply(SparkSubmit.scala:356)
at scala.Option.map(Option.scala:146)
at org.apache.spark.deploy.SparkSubmit.prepareSubmitEnvironment(SparkSubmit.scala:355)
at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:774)
at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:161)
at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:184)
at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:86)
at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:920)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:929)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.StringIndexOutOfBoundsException: begin 0, end 3, length 2
at java.base/java.lang.String.checkBoundsBeginEnd(String.java:3756)
at java.base/java.lang.String.substring(String.java:1902)
at org.apache.hadoop.util.Shell.<clinit>(Shell.java:52)
What I tried:
I tried changing JAVA_HOME with the following command:
export JAVA_HOME=/usr/local/opt/java
The previous JAVA_HOME path was /opt/anaconda3, and I can see that JAVA_HOME has changed to /usr/local/opt/java.
I still get the error message. Any answers/feedback are appreciated. Thank you!!!
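As a diagnostic aside: the "Unable to find any JVMs matching version \"1.8\"" message comes from macOS's java_home helper, so a quick way to see what Java installations the system can actually find (before changing anything) is:

```shell
# List every JDK that macOS's java_home helper knows about.
/usr/libexec/java_home -V

# Ask specifically for a 1.8 JVM; this reproduces the error
# above if no Java 8 JDK is installed.
/usr/libexec/java_home -v 1.8

# Show what spark-shell will currently pick up.
echo "$JAVA_HOME"
java -version
```

If the second command fails, no Java 8 runtime is installed, which matches the fix in the accepted solution below.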
Solution
Follow these steps on macOS.
Step 1. Install Java 8, since Spark 2.2 and later require Java 8 (see the Spark documentation for details):
brew install openjdk@8
Then update the Java path:
export JAVA_HOME=/usr/local/opt/openjdk@8/libexec/openjdk.jdk/Contents/Home
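Note that the export above only lasts for the current terminal session. A minimal sketch of making it persistent, assuming the default zsh shell and the Homebrew install prefix from the answer (on Apple silicon Macs, Homebrew typically lives under /opt/homebrew instead, so adjust the path accordingly):

```shell
# Append the JAVA_HOME export to the shell startup file so every
# new terminal session picks it up.
echo 'export JAVA_HOME=/usr/local/opt/openjdk@8/libexec/openjdk.jdk/Contents/Home' >> ~/.zshrc
echo 'export PATH="$JAVA_HOME/bin:$PATH"' >> ~/.zshrc

# Reload the startup file in the current session.
source ~/.zshrc

# Verify: this should now report version 1.8.x.
java -version

# Then re-run spark-shell to confirm the original error is gone.
spark-shell
```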