Unable to connect to a standalone Spark cluster via sparklyr. How to debug?

Problem description

I can confirm that connecting to the cluster with spark-shell, e.g.

spark-shell --master spark://myurl:7077

works, but

library(sparklyr)

sc <- spark_connect(
  master = "spark://myurl:7077",
  spark_home = "d:/spark/spark-2.4.4-bin-hadoop2.7/"
)

does not, and gives the error:

Error in force(code) : 
  Failed while connecting to sparklyr to port (8880) for sessionid (59811): Gateway in localhost:8880 did not respond.
    Path: d:\spark\spark-2.4.4-bin-hadoop2.7\bin\spark-submit2.cmd
    Parameters: --class, sparklyr.Shell, "C:\Users\user1\Documents\R\win-library\3.6\sparklyr\java\sparklyr-2.3-2.11.jar", 8880, 59811
    Log: C:\Users\user1\AppData\Local\Temp\RtmpottVxI\file66ec13ea6ef0_spark.log


---- Output Log ----
Error: Could not create the Java Virtual Machine.
Error: A fatal exception has occurred. Program will exit.
Invalid maximum heap size: -Xmx10g
The specified size exceeds the maximum representable size.

Tags: apache-spark, sparklyr

Solution


It turned out I needed to install the Java 8 JDK rather than just the JRE.
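The "Invalid maximum heap size: -Xmx10g" message in the log is also characteristic of a 32-bit JVM, which cannot address a heap that large. A quick sanity check of which Java is picked up, as a sketch assuming `java` is on the same PATH that R/sparklyr uses:

```shell
# Sketch: check whether the default `java` is a 64-bit build.
# A 32-bit JVM rejects -Xmx10g with "Invalid maximum heap size",
# which matches the spark-submit log above.
if java -version 2>&1 | grep -q "64-Bit"; then
  echo "64-bit JVM found: large -Xmx values should work"
else
  echo "no 64-bit JVM on PATH: install a 64-bit Java 8 JDK"
fi

# The JDK (unlike the bare JRE) also ships the compiler,
# so a missing javac suggests only a JRE is installed:
javac -version 2>/dev/null || echo "javac missing: only a JRE is installed"
```

On Windows it is also worth confirming that JAVA_HOME points at the JDK directory, so that sparklyr's spark-submit2.cmd launches the intended JVM.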

