
Problem description

I get an error when opening a Livy session via a REST request with jars pulled from S3. I have checked the S3 credentials and the endpoint and they are fine, and I don't understand why this doesn't work.
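For context, the request that produces the session configuration dumped in the log below would look roughly like this (a minimal sketch in Python; the Livy URL is an assumption, while the jar path, endpoint and credentials are the redacted values exactly as they appear in the log):

import json
import requests

LIVY_URL = "http://localhost:8998"  # assumption: default Livy endpoint

# Session request reconstructed from the config dumped in the log below.
payload = {
    "kind": "spark",
    "jars": ["s3://jars"],
    "heartbeatTimeoutInSecond": 0,
    "conf": {
        "spark.master": "local[1]",
        "spark.app.name": "devMode-session",
        "spark.driver.memory": "512m",
        "spark.driver.cores": "1",
        "spark.hadoop.fs.s3.impl": "org.apache.hadoop.fs.s3a.S3AFileSystem",
        "spark.hadoop.fs.s3a.path.style.access": "true",
        "spark.hadoop.fs.s3a.access.key": "access",
        "spark.hadoop.fs.s3a.secret.key": "Key",
        "spark.hadoop.fs.s3a.endpoint": "http://endpoint",
        "spark.hadoop.fs.s3a.block.size": "512M",
        # spelled "connexion" in the log; see the note under Solution
        "spark.hadoop.fs.s3a.connexion.maximum": "4000",
    },
}

resp = requests.post(f"{LIVY_URL}/sessions",
                     data=json.dumps(payload),
                     headers={"Content-Type": "application/json"})
print(resp.status_code, resp.json())

The session then dies during startup with the following output: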

livy_1          | Creating Interactive session 2: [owner: null, request: [kind: spark, proxyUser: None, jars: s3://jars, conf: spark.hadoop.fs.s3.impl -> org.apache.hadoop.fs.s3a.S3AFileSystem,spark.hadoop.fs.s3a.path.style.access -> true,spark.app.name -> devMode-session,spark.hadoop.fs.s3a.secret.key -> Key,spark.driver.memory -> 512m,spark.hadoop.fs.s3a.access.key -> access,spark.driver.cores -> 1,spark.master -> local[1],spark.hadoop.fs.s3a.block.size -> 512M,spark.hadoop.fs.s3a.connexion.maximum -> 4000,spark.hadoop.fs.s3a.endpoint -> http://endpoint, heartbeatTimeoutInSecond: 0]]
livy_1          | Connected to the port 10000
livy_1          | Your hostname, 136e925e003d, resolves to a loopback address, but we couldn't find any external IP address!
livy_1          | Set livy.rsc.rpc.server.address if you need to bind to another address.
livy_1          | Registering new session 2
livy_1          | Registered new session 2
livy_1          | 21/04/20 15:58:32  org.apache.hadoop.util.NativeCodeLoader WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
livy_1          | 21/04/20 15:58:32  org.apache.hadoop.metrics2.impl.MetricsConfig WARN MetricsConfig: Cannot locate configuration: tried hadoop-metrics2-s3a-file-system.properties,hadoop-metrics2.properties
livy_1          | Waiting thread interrupted, killing child process.
livy_1          | Failed to connect to context.
livy_1          | Caused by: java.io.IOException: RSCClient instance stopped.
livy_1          |   at org.apache.livy.rsc.RSCClient.stop(RSCClient.java:244)
livy_1          |   at org.apache.livy.rsc.RSCClient.connectionError(RSCClient.java:155)
livy_1          |   at org.apache.livy.rsc.RSCClient.access$300(RSCClient.java:51)
livy_1          |   at org.apache.livy.rsc.RSCClient$1.onFailure(RSCClient.java:95)
livy_1          |   at org.apache.livy.rsc.Utils$2.operationComplete(Utils.java:108)
livy_1          |   at io.netty.util.concurrent.DefaultPromise.notifyListener0(DefaultPromise.java:577)
livy_1          |   at io.netty.util.concurrent.DefaultPromise.notifyListeners0(DefaultPromise.java:570)
livy_1          |   at io.netty.util.concurrent.DefaultPromise.notifyListenersNow(DefaultPromise.java:549)
livy_1          |   at io.netty.util.concurrent.DefaultPromise.access$200(DefaultPromise.java:35)
livy_1          |   at io.netty.util.concurrent.DefaultPromise$1.run(DefaultPromise.java:501)
livy_1          |   at io.netty.util.concurrent.AbstractEventExecutor.safeExecute(AbstractEventExecutor.java:164)
livy_1          |   at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:472)
livy_1          |   at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:500)
livy_1          |   at io.netty.util.concurrent.SingleThreadEventExecutor$4.run(SingleThreadEventExecutor.java:989)
livy_1          |   at io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74)
livy_1          |   ... 1 more
livy_1          | Failing pending job 4576508d-681e-4e07-8c4c-8182078b6194 due to shutdown.
livy_1          | Failed to ping RSC driver for session 2. Killing application.
livy_1          | Stopping InteractiveSession 2...
livy_1          | job was killed by user
livy_1          | Stopped InteractiveSession 2.
livy_1          | Fail to get rsc uri
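From the client side the session ends up in the dead state, and the same startup log can be pulled back over Livy's REST API (a minimal sketch, assuming Livy listens on the default port 8998):

import requests

LIVY_URL = "http://localhost:8998"  # assumption: default Livy endpoint
SESSION_ID = 2                      # the session id from the log above

# Session state; for the failure above this comes back as "dead".
info = requests.get(f"{LIVY_URL}/sessions/{SESSION_ID}").json()
print(info["state"])

# The same startup log, fetched through the REST API.
log = requests.get(f"{LIVY_URL}/sessions/{SESSION_ID}/log",
                   params={"from": 0, "size": 100}).json()
print("\n".join(log["log"]))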

Tags: scala, apache-spark, livy

Solution
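The log itself carries the two most actionable hints. First, the container hostname 136e925e003d resolves to a loopback address, and Livy explicitly suggests setting livy.rsc.rpc.server.address. If the RSC RPC server advertises a loopback address inside the Docker container, the Spark driver can never call back, and the session dies exactly as shown ("Failed to connect to context", "Fail to get rsc uri"). A sketch of the server-side fix, assuming the setting goes in livy-client.conf and using a hypothetical reachable container address:

# livy-client.conf (file location is an assumption; adjust to your deployment)
livy.rsc.rpc.server.address = 172.17.0.2   # hypothetical address the driver can reach
livy.rsc.server.connect.timeout = 120s     # optionally allow the driver more time to call back

Second, the logged key spark.hadoop.fs.s3a.connexion.maximum is misspelled: the Hadoop property is fs.s3a.connection.maximum, so the intended value of 4000 is most likely being silently ignored. That alone should not kill the session, but it is worth correcting.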

