NullPointerException when connecting from Spark to Postgres - why?

Problem description

import org.apache.spark
import org.apache.spark.sql.SQLContext

object App {
  def main(args: Array[String]) {
    val conf = new spark.SparkConf().setMaster("local[2]").setAppName("mySparkApp")
    val sc = new spark.SparkContext(conf)
    val sqlContext = new SQLContext(sc)

    val jdbcUrl = "1.2.34.567" 
    val jdbcUser = "someUser"
    val jdbcPassword = "xxxxxxxxxxxxxxxxxxxx"
    val tableName = "myTable"
    val driver = "org.postgresql.Driver"
    Class.forName(driver)

    val df = sqlContext
            .read
            .format("jdbc")
            .option("driver", driver)
            .option("url", jdbcUrl)
            .option("userName", jdbcUser)
            .option("password", jdbcPassword)
            .option("dbtable", tableName) // NullPointerException occurs here
            .load()
  }
}

I want to connect from Spark to a Postgres database on my LAN. At runtime, the following error occurs:

Exception in thread "main" java.lang.NullPointerException
    at org.apache.spark.sql.execution.datasources.jdbc.JDBCRDD$.resolveTable(JDBCRDD.scala:71)
    at org.apache.spark.sql.execution.datasources.jdbc.JDBCRelation$.getSchema(JDBCRelation.scala:210)
    at org.apache.spark.sql.execution.datasources.jdbc.JdbcRelationProvider.createRelation(JdbcRelationProvider.scala:35)
    at org.apache.spark.sql.execution.datasources.DataSource.resolveRelation(DataSource.scala:318)
    at org.apache.spark.sql.DataFrameReader.loadV1Source(DataFrameReader.scala:223)
    at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:211)
    at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:167)
    at <redacted>.App$.main(App.scala:42)
    at <redacted>.App.main(App.scala)

Is there an obvious reason why the option("dbtable", tableName) line would throw a NullPointerException? I am using spark-2.3.1-bin-hadoop2.7 with Scala 2.11.12. For the Postgres dependency, I am using this version:

        <dependency>
            <groupId>org.postgresql</groupId>
            <artifactId>postgresql</artifactId>
            <version>9.4-1200-jdbc41</version>
        </dependency>

Tags: postgresql, scala, apache-spark

Solution

The error message (which is not very helpful for troubleshooting) is most likely not about the dbtable option but about the url option.

It looks like your jdbcUrl is missing the URL protocol jdbc:postgresql:// as its prefix. When the URL does not carry a prefix the driver recognizes, the Postgres driver's connect() returns null rather than throwing (that is the JDBC contract for an unrecognized URL), and Spark then dereferences that null connection inside JDBCRDD.resolveTable, which is why the failure surfaces as an unhelpful NullPointerException. For the expected URL format, see the Spark documentation for the JDBC data source (JDBC To Other Databases).
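
For illustration, here is a minimal sketch of what the corrected read could look like. It is only a sketch, not the exact fix for your environment: the port 5432 and the database name mydatabase are placeholder assumptions, so substitute your own host, port, and database. Note also that Spark's JDBC data source expects the option key user rather than userName.

// A minimal sketch, assuming Postgres listens on the default port 5432
// and that the target database is named "mydatabase" (both placeholders).
val jdbcUrl = "jdbc:postgresql://1.2.34.567:5432/mydatabase"

val df = sqlContext
  .read
  .format("jdbc")
  .option("driver", "org.postgresql.Driver")
  .option("url", jdbcUrl)           // the url option now carries the jdbc:postgresql:// prefix
  .option("user", jdbcUser)         // Spark's JDBC source reads the "user" option
  .option("password", jdbcPassword)
  .option("dbtable", tableName)
  .load()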

