Unable to kill a Spark application via spark-submit on a Spark standalone cluster with Spark authentication and encryption enabled

Problem Description

On a Spark standalone cluster with Spark authentication and encryption enabled, I am unable to kill a Spark application with the spark-submit kill command:

```
bin/spark-class org.apache.spark.deploy.Client kill spark://host:7077 driver-20200728102235-0005
```

It fails with the following error:

```
Exception in thread "main" org.apache.spark.SparkException: Exception thrown in awaitResult:
        at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:226)
        at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75)
        at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:101)
        at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:109)
        at org.apache.spark.deploy.ClientApp$$anonfun$7.apply(Client.scala:243)
        at org.apache.spark.deploy.ClientApp$$anonfun$7.apply(Client.scala:243)
        at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:234)
        at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:234)
        at scala.collection.IndexedSeqOptimized$class.foreach(IndexedSeqOptimized.scala:33)
        at scala.collection.mutable.ArrayOps$ofRef.foreach(ArrayOps.scala:186)
        at scala.collection.TraversableLike$class.map(TraversableLike.scala:234)
        at scala.collection.mutable.ArrayOps$ofRef.map(ArrayOps.scala:186)
        at org.apache.spark.deploy.ClientApp.start(Client.scala:243)
        at org.apache.spark.deploy.Client$.main(Client.scala:225)
        at org.apache.spark.deploy.Client.main(Client.scala)
Caused by: java.lang.RuntimeException: java.lang.IllegalArgumentException: Unknown challenge message.
        at org.apache.spark.network.crypto.AuthRpcHandler.receive(AuthRpcHandler.java:109)
        at org.apache.spark.network.server.TransportRequestHandler.processRpcRequest(TransportRequestHandler.java:180)
        at org.apache.spark.network.server.TransportRequestHandler.handle(TransportRequestHandler.java:103)
        at org.apache.spark.network.server.TransportChannelHandler.channelRead(TransportChannelHandler.java:118)
        at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:362)
        at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:348)
        at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:340)
        at io.netty.handler.timeout.IdleStateHandler.channelRead(IdleStateHandler.java:286)
```

Tags: apache-spark, pyspark

Solution


On your Spark client, try setting `-Dspark.authenticate=true -Dspark.network.crypto.enabled=true` in configuration properties such as `spark.executor.extraJavaOptions` and in the `spark.executorEnv.JAVA_TOOL_OPTIONS` environment variable, and so on.

Also check that your shared secret is stored in the `spark.executorEnv._SPARK_AUTH_SECRET` environment variable.
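Putting the settings above together, a submission might look like the following sketch. The master URL, the `<shared-secret>` value, and the application file are placeholders to replace with your own; all of the `spark.*` property names are standard Spark configuration keys.

```shell
# Sketch: submit with RPC authentication and network encryption enabled.
# <shared-secret> is a placeholder; every node must use the same value.
spark-submit \
  --master spark://host:7077 \
  --deploy-mode cluster \
  --conf spark.authenticate=true \
  --conf spark.network.crypto.enabled=true \
  --conf spark.authenticate.secret=<shared-secret> \
  --conf "spark.executor.extraJavaOptions=-Dspark.authenticate=true -Dspark.network.crypto.enabled=true" \
  --conf spark.executorEnv._SPARK_AUTH_SECRET=<shared-secret> \
  your-app.py
```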

If that does not work, I suggest adding your spark-submit configuration to this question.
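The kill command itself runs in a separate JVM started by `bin/spark-class`, and that JVM also needs the same authentication settings, otherwise the master rejects its RPC handshake with the "Unknown challenge message" error shown above. One way to pass them, sketched below with the secret as a placeholder, is the standard `JAVA_TOOL_OPTIONS` environment variable, which any HotSpot JVM picks up automatically:

```shell
# Sketch: give the Client JVM the same auth settings as the cluster.
# JAVA_TOOL_OPTIONS is read by the JVM itself, so no Spark scripts change.
export JAVA_TOOL_OPTIONS="-Dspark.authenticate=true \
  -Dspark.network.crypto.enabled=true \
  -Dspark.authenticate.secret=<shared-secret>"

bin/spark-class org.apache.spark.deploy.Client kill \
  spark://host:7077 driver-20200728102235-0005
```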

