Cannot find 'spark-submit2.cmd'

Problem description

> library('BBmisc')
> library('sparklyr')
> sc <- spark_connect(master = 'local')
Error in start_shell(master = master, spark_home = spark_home, spark_version = version,  : 
  Failed to find 'spark-submit2.cmd' under 'C:\Users\Owner\AppData\Local\spark\spark-3.0.0-bin-hadoop2.7', please verify - SPARK_HOME.
> spark_home_dir()
[1] "C:\\Users\\Owner\\AppData\\Local/spark/spark-3.0.0-bin-hadoop2.7"
> spark_installed_versions()
  spark hadoop                                                              dir
1 3.0.0    2.7 C:\\Users\\Owner\\AppData\\Local/spark/spark-3.0.0-bin-hadoop2.7
> spark_home_set()
Setting SPARK_HOME environment variable to C:\Users\Owner\AppData\Local/spark/spark-3.0.0-bin-hadoop2.7
> sc <- spark_connect(master = 'local')
Error in start_shell(master = master, spark_home = spark_home, spark_version = version,  : 
  Failed to find 'spark-submit2.cmd' under 'C:\Users\Owner\AppData\Local\spark\spark-3.0.0-bin-hadoop2.7', please verify - SPARK_HOME.

Source: https://github.com/englianhu/binary.com-interview-question/issues/1#issue-733943885

How can I resolve Failed to find 'spark-submit2.cmd' under 'C:\Users\Owner\AppData\Local\spark\spark-3.0.0-bin-hadoop2.7', please verify - SPARK_HOME.?
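The error message suggests a check that is easy to reproduce: verify whether the reported SPARK_HOME actually contains the launcher script. A minimal diagnostic sketch, using the path from the transcript above (on Windows, spark-submit2.cmd ships in the bin/ directory of a Spark distribution):

# Minimal diagnostic sketch, using the path reported in the transcript above.
spark_home <- 'C:/Users/Owner/AppData/Local/spark/spark-3.0.0-bin-hadoop2.7'
file.exists(file.path(spark_home, 'bin', 'spark-submit2.cmd'))
# FALSE means the extracted distribution is incomplete or SPARK_HOME points
# at the wrong directory.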

Reference: Need help getting started with Spark and sparklyr

Tags: r, apache-spark, r-package

Solution


Solved!!!

Steps:

  1. Download a prebuilt Spark package from https://spark.apache.org/downloads.html.
  2. Extract the archive to 'C:/Users/scibr/AppData/Local/spark/spark-3.0.1-bin-hadoop3.2'.
  3. Manually point SPARK_HOME at the new version: spark_home_set('C:/Users/scibr/AppData/Local/spark/spark-3.0.1-bin-hadoop3.2') (see the sketch after this list).
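Put together, the fix amounts to the following session, a minimal sketch assuming the archive from step 1 was extracted to the directory named in step 2:

# Minimal sketch of the steps above.
library(sparklyr)

# Point SPARK_HOME at the freshly extracted distribution.
spark_home_set('C:/Users/scibr/AppData/Local/spark/spark-3.0.1-bin-hadoop3.2')

# Connecting should now find bin/spark-submit2.cmd under SPARK_HOME.
sc <- spark_connect(master = 'local')
spark_disconnect(sc)

Alternatively, spark_install(version = '3.0.1') lets sparklyr download and extract a matching build into its default installation directory, which avoids setting SPARK_HOME by hand.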

GitHub source: https://github.com/englianhu/binary.com-interview-question/issues/1#event-3968919946

