scala - How to fix "Can't assign requested address: Service 'sparkDriver' failed after 16 retries" when running Spark code?
Problem
I'm learning Spark + Scala with IntelliJ, starting from the small piece of code below:
import org.apache.spark.{SparkConf, SparkContext}

object ActionsTransformations {

  def main(args: Array[String]): Unit = {
    // Create a SparkContext to initialize Spark
    val conf = new SparkConf()
    conf.setMaster("local")
    conf.setAppName("Word Count")
    val sc = new SparkContext(conf)

    val numbersList = sc.parallelize(1.to(10000).toList)
    println(numbersList)
  }
}
When I try to run it, I get the exception below:
Exception in thread "main" java.net.BindException: Can't assign requested address: Service 'sparkDriver' failed after 16 retries (on a random free port)! Consider explicitly setting the appropriate binding address for the service 'sparkDriver' (for example spark.driver.bindAddress for SparkDriver) to the correct binding address.
at sun.nio.ch.Net.bind0(Native Method)
at sun.nio.ch.Net.bind(Net.java:433)
at sun.nio.ch.Net.bind(Net.java:425)
at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:223)
at io.netty.channel.socket.nio.NioServerSocketChannel.doBind(NioServerSocketChannel.java:127)
at io.netty.channel.AbstractChannel$AbstractUnsafe.bind(AbstractChannel.java:501)
at io.netty.channel.DefaultChannelPipeline$HeadContext.bind(DefaultChannelPipeline.java:1218)
at io.netty.channel.AbstractChannelHandlerContext.invokeBind(AbstractChannelHandlerContext.java:496)
at io.netty.channel.AbstractChannelHandlerContext.bind(AbstractChannelHandlerContext.java:481)
at io.netty.channel.DefaultChannelPipeline.bind(DefaultChannelPipeline.java:965)
at io.netty.channel.AbstractChannel.bind(AbstractChannel.java:210)
at io.netty.bootstrap.AbstractBootstrap$2.run(AbstractBootstrap.java:353)
at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:399)
at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:446)
at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:131)
at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:144)
at java.lang.Thread.run(Thread.java:745)
Process finished with exit code 1
Can anyone suggest what to do?
Solution
Add SPARK_LOCAL_IP to the load-spark-env.sh file in the spark/bin directory:

export SPARK_LOCAL_IP="127.0.0.1"
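Alternatively, if you'd rather not edit the Spark scripts, the exception message itself hints at another fix: set spark.driver.bindAddress explicitly. A minimal sketch (not part of the original answer; assumes a local Spark dependency on the classpath), adapting the question's code:

```scala
import org.apache.spark.{SparkConf, SparkContext}

object ActionsTransformations {

  def main(args: Array[String]): Unit = {
    val conf = new SparkConf()
      .setMaster("local")
      .setAppName("Word Count")
      // Explicitly bind the driver to the loopback address,
      // as suggested by the exception message (spark.driver.bindAddress).
      .set("spark.driver.bindAddress", "127.0.0.1")
      // Also pin the host the driver advertises, to avoid
      // problems resolving the machine's hostname.
      .set("spark.driver.host", "127.0.0.1")

    val sc = new SparkContext(conf)
    val numbersList = sc.parallelize(1.to(10000).toList)
    println(numbersList)
    sc.stop()
  }
}
```

Both approaches address the same root cause: the driver cannot bind a port on the address Spark resolved for the local machine, so pointing it at 127.0.0.1 for local-mode development sidesteps hostname resolution entirely.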