r - Cannot find 'spark-submit2.cmd'
Problem description
> library('BBmisc')
> library('sparklyr')
> sc <- spark_connect(master = 'local')
Error in start_shell(master = master, spark_home = spark_home, spark_version = version, :
Failed to find 'spark-submit2.cmd' under 'C:\Users\Owner\AppData\Local\spark\spark-3.0.0-bin-hadoop2.7', please verify - SPARK_HOME.
> spark_home_dir()
[1] "C:\\Users\\Owner\\AppData\\Local/spark/spark-3.0.0-bin-hadoop2.7"
> spark_installed_versions()
spark hadoop dir
1 3.0.0 2.7 C:\\Users\\Owner\\AppData\\Local/spark/spark-3.0.0-bin-hadoop2.7
> spark_home_set()
Setting SPARK_HOME environment variable to C:\Users\Owner\AppData\Local/spark/spark-3.0.0-bin-hadoop2.7
> sc <- spark_connect(master = 'local')
Error in start_shell(master = master, spark_home = spark_home, spark_version = version, :
Failed to find 'spark-submit2.cmd' under 'C:\Users\Owner\AppData\Local\spark\spark-3.0.0-bin-hadoop2.7', please verify - SPARK_HOME.
Source: https://github.com/englianhu/binary.com-interview-question/issues/1#issue-733943885
How can I resolve the error "Failed to find 'spark-submit2.cmd' under 'C:\Users\Owner\AppData\Local\spark\spark-3.0.0-bin-hadoop2.7', please verify - SPARK_HOME."?
Solution
Solved!
Steps:
- Download Spark from https://spark.apache.org/downloads.html
- Unzip the archive to 'C:/Users/scibr/AppData/Local/spark/spark-3.0.1-bin-hadoop3.2'.
- Manually point SPARK_HOME at the latest version:
spark_home_set('C:/Users/scibr/AppData/Local/spark/spark-3.0.1-bin-hadoop3.2')
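The steps above can be sketched in R as follows. This is a minimal sketch, assuming the sparklyr package is installed; the install path shown is an example and will differ with your user name and the Spark build you download:

```r
library(sparklyr)

# Install a fresh Spark build if one is not already present.
# spark_install() downloads and unpacks the archive into sparklyr's
# cache directory (on Windows, under C:/Users/<user>/AppData/Local/spark).
spark_install(version = "3.0.1")

# Point SPARK_HOME at the newly extracted directory. The exact path
# here is an example; use the directory the archive was unpacked into.
spark_home_set("C:/Users/scibr/AppData/Local/spark/spark-3.0.1-bin-hadoop3.2")

# Reconnect; with a valid SPARK_HOME, spark_connect() can locate
# spark-submit2.cmd and start the local Spark session.
sc <- spark_connect(master = "local")
spark_disconnect(sc)
```

Reinstalling a fresh build and setting SPARK_HOME explicitly avoids the mixed-separator path (backslashes and forward slashes) visible in the `spark_home_dir()` output above, which is one common reason the launcher script is not found.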
GitHub source: https://github.com/englianhu/binary.com-interview-question/issues/1#event-3968919946