apache-spark - SparkListener onApplicationEnd event stops only the driver; executors are not cleaned up
Problem description
I want to use a SparkListener to end the application's execution, but when I stop the application it only stops the driver; the executors are neither stopped nor cleaned up.
The executors added at startup (observed via onApplicationStart in my SparkListener) should be terminated along with the driver when onApplicationEnd is invoked, but in practice only the driver terminates and the executors keep running.
import org.apache.spark.scheduler.{SparkListener, SparkListenerApplicationEnd}

class TestListener extends SparkListener {
  override def onApplicationEnd(appEnded: SparkListenerApplicationEnd): Unit = {}
}

// `sc` is the application's SparkContext
object slyest extends App { sc.addSparkListener(new TestListener) }
Solution
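A common approach, assuming the executors are left running because only the driver JVM exits, is to stop the SparkContext explicitly: sc.stop() deregisters the application from the cluster manager, which tears down the executor JVMs as well as the driver-side services. A minimal sketch (the TestApp object name and local master are assumptions, not from the original question):

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.scheduler.{SparkListener, SparkListenerApplicationEnd}

object TestApp extends App {
  val conf = new SparkConf().setAppName("listener-demo").setMaster("local[*]")
  val sc   = new SparkContext(conf)

  sc.addSparkListener(new SparkListener {
    override def onApplicationEnd(appEnded: SparkListenerApplicationEnd): Unit = {
      // This event fires while the context is already shutting down, so do NOT
      // call sc.stop() here; limit this hook to cleaning up external resources.
      println(s"Application ended at ${appEnded.time}")
    }
  })

  // ... job logic ...

  // Stopping the context (rather than calling System.exit on the driver)
  // lets Spark deregister from the cluster manager and kill the executors.
  sc.stop()
}

Note that onApplicationEnd is a notification of shutdown, not a place to trigger it; initiating the stop from inside the listener would re-enter the shutdown path.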