CONTEXT_ID added in SJS 0.9.0 is set to null in the table

Problem description

I am trying to move my application to the new SJS 0.9.0. After creating a context, when I submit a job, the following happens:

19/04/10 21:45:06 ERROR JobDAOActor: About to restart actor due to exception:
org.postgresql.util.PSQLException: ERROR: null value in column "CONTEXT_ID" violates not-null constraint
  Detail: Failing row contains (c144684e-3dad-459d-acff-2ac353709092, SJS_512_MB_shared_prioritized@1, 1, spark.jobserver.DPAASExecutor, 2019-04-10 21:45:06.602, null, null, null, null, null, null).
    at org.postgresql.core.v3.QueryExecutorImpl.receiveErrorResponse(QueryExecutorImpl.java:2270)
    at org.postgresql.core.v3.QueryExecutorImpl.processResults(QueryExecutorImpl.java:1998)
    at org.postgresql.core.v3.QueryExecutorImpl.execute(QueryExecutorImpl.java:255)
    at org.postgresql.jdbc2.AbstractJdbc2Statement.execute(AbstractJdbc2Statement.java:570)
    at org.postgresql.jdbc2.AbstractJdbc2Statement.executeWithFlags(AbstractJdbc2Statement.java:420)
    at org.postgresql.jdbc2.AbstractJdbc2Statement.executeUpdate(AbstractJdbc2Statement.java:366)
    at slick.driver.JdbcActionComponent$InsertActionComposerImpl$InsertOrUpdateAction$$anonfun$nativeUpsert$1.apply(JdbcActionComponent.scala:560)
    at slick.driver.JdbcActionComponent$InsertActionComposerImpl$InsertOrUpdateAction$$anonfun$nativeUpsert$1.apply(JdbcActionComponent.scala:557)
    at slick.jdbc.JdbcBackend$SessionDef$class.withPreparedStatement(JdbcBackend.scala:347)
    at slick.jdbc.JdbcBackend$BaseSession.withPreparedStatement(JdbcBackend.scala:407)
    at slick.driver.JdbcActionComponent$InsertActionComposerImpl.preparedInsert(JdbcActionComponent.scala:498)
    at slick.driver.JdbcActionComponent$InsertActionComposerImpl$InsertOrUpdateAction.nativeUpsert(JdbcActionComponent.scala:557)
    at slick.driver.JdbcActionComponent$InsertActionComposerImpl$InsertOrUpdateAction.f$1(JdbcActionComponent.scala:540)
    at slick.driver.JdbcActionComponent$InsertActionComposerImpl$InsertOrUpdateAction.run(JdbcActionComponent.scala:545)
    at slick.driver.JdbcActionComponent$SimpleJdbcDriverAction.run(JdbcActionComponent.scala:32)
    at slick.driver.JdbcActionComponent$SimpleJdbcDriverAction.run(JdbcActionComponent.scala:29)
    at slick.backend.DatabaseComponent$DatabaseDef$$anon$2.liftedTree1$1(DatabaseComponent.scala:237)
    at slick.backend.DatabaseComponent$DatabaseDef$$anon$2.run(DatabaseComponent.scala:237)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)

This is the feature that was added in SJS 0.9.0: https://github.com/spark-jobserver/spark-jobserver/pull/1058

Can you explain why this happens? Should I be passing any additional properties, such as CONTEXT_ID, when submitting the job, since it was only added in SJS 0.9.0? I have already gone through https://github.com/spark-jobserver/spark-jobserver/releases/tag/v0.9.0 .. or does Spark take care of CONTEXT_ID on its own?
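
For reference, this is roughly the flow I use against the Job Server REST API (a minimal sketch; the host, binary name, and context settings below are simplified placeholders, not my exact setup). Note that the client never sends a CONTEXT_ID anywhere:

    # Rough sketch of the calls made against the SJS REST API.
    # Host, appName and context settings are placeholders.
    import requests

    SJS = "http://localhost:8090"  # Job Server address (placeholder)

    # 1. Create the context.
    requests.post(
        f"{SJS}/contexts/SJS_512_MB_shared_prioritized",
        params={"num-cpu-cores": "1", "memory-per-node": "512m"},
    )

    # 2. Submit a job to that context. The client only names the context;
    #    the server is expected to fill in CONTEXT_ID when it persists the job row.
    resp = requests.post(
        f"{SJS}/jobs",
        params={
            "appName": "my-app",  # uploaded binary name (placeholder)
            "classPath": "spark.jobserver.DPAASExecutor",
            "context": "SJS_512_MB_shared_prioritized",
        },
        data="input = {}",  # job configuration body
    )
    print(resp.status_code, resp.text)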

Tags: apache-spark, spark-jobserver

Solution


This was actually happening because our application code was still using the jar from an earlier SJS release (0.8.0), and only the job_server directory had the SJS 0.9.0 jar. As a result, the manager_start.sh being used was the one that applies to 0.8.0, not 0.9.0. After replacing it with the 0.9.0 version, everything works fine!
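
A quick way to catch this kind of mismatch is to compare the SJS version baked into each jar. A minimal sketch, assuming the jar paths below (adjust them to your deployment) and that the jars carry an Implementation-Version entry in their manifest:

    # Sketch: compare the SJS release of the jar deployed next to manager_start.sh
    # with the jar bundled in the application (paths are assumptions).
    import zipfile

    def sjs_version(jar_path: str) -> str:
        # Read Implementation-Version from META-INF/MANIFEST.MF, if present.
        with zipfile.ZipFile(jar_path) as jar:
            manifest = jar.read("META-INF/MANIFEST.MF").decode("utf-8", "replace")
        for line in manifest.splitlines():
            if line.startswith("Implementation-Version"):
                return line.split(":", 1)[1].strip()
        return "unknown"

    for jar in (
        "/opt/job_server/spark-job-server.jar",    # jar in the job_server directory
        "/opt/app/lib/spark-job-server_2.11.jar",  # jar our application code uses
    ):
        print(jar, "->", sjs_version(jar))         # both should report 0.9.0

Since manager_start.sh has to come from the same release as the jar it launches, any version mismatch reported here points at the same problem we hit.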
