Unable to load a properties file from local into Spark: FileNotFoundException

Problem description

I am learning to write a Scala/Spark JDBC program in IntelliJ IDEA. For this I created a Scala SBT project, whose structure is shown below:

Before writing the JDBC connection parameters in the class, I first tried to load a properties file that holds all of my connection properties and to print them to check that they load correctly. The file looks like this:

devUserName=username
devPassword=password
gpDriverClass=org.postgresql.Driver
gpDevUrl=jdbc:url

My connection properties file is on the local filesystem, not on HDFS:

[hmusr@ip-xx-xxx-xxx-x inputdir]$ pwd
/home/hmusr/ReconTest/inputdir
[hmusr@ip-xx-xxx-xxx-x inputdir]$ ls
testconnection.properties  yearpartition_2.11-0.1.jar

Code:

package com.yearpartition.obj

import java.io.FileInputStream
import java.util.Properties
import org.apache.spark.sql.SparkSession
import org.apache.spark.SparkConf

object PartitionRetrieval {

  var conf = new SparkConf().setAppName("Spark-JDBC")
  val conFile = "file:///home/hmusr/ReconTest/inputdir/testconnection.properties"
  val properties = new Properties()
  properties.load(new FileInputStream(conFile))
  val connectionUrl = properties.getProperty("gpDevUrl")
  val devUserName=properties.getProperty("devUserName")
  val devPassword=properties.getProperty("devPassword")
  val gpDriverClass=properties.getProperty("gpDriverClass")

  println("connectionUrl: " + connectionUrl)

  Class.forName(gpDriverClass).newInstance()

  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().enableHiveSupport().config(conf).master("local[2]").getOrCreate()
    println("connectionUrl: " + connectionUrl)
  }
}

Contents of build.sbt:

name := "YearPartition"

version := "0.1"

scalaVersion := "2.11.8"

libraryDependencies ++=  {
  val sparkCoreVer = "2.2.0"
  val sparkSqlVer = "2.2.0"
  Seq(
    "org.apache.spark" %% "spark-core" % sparkCoreVer % "provided" withSources(),
    "org.apache.spark" %% "spark-sql" % sparkSqlVer % "provided"  withSources(),
    "org.json4s" %% "json4s-jackson" % "3.2.11" % "provided",
    "org.apache.httpcomponents" % "httpclient" % "4.5.3"
  )
}

Since I am not writing or saving data to any file, and am only trying to print the values from the properties file, I submitted the job as follows:

SPARK_MAJOR_VERSION=2 spark-submit --class com.yearpartition.obj.PartitionRetrieval yearpartition_2.11-0.1.jar

But I get a FileNotFoundException, as shown below:

Caused by: java.io.FileNotFoundException: file:/home/hmusr/ReconTest/inputdir/testconnection.properties (No such file or directory)

My attempts to fix it have been in vain. Could anyone tell me what mistake I am making here and how I can correct it?

Tags: scala, apache-spark

Solution
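
`java.io.FileInputStream` takes a plain filesystem path, not a URI. The string `file:///home/hmusr/ReconTest/inputdir/testconnection.properties` is therefore treated as a literal (relative) file name, which is exactly why the exception message shows `file:/home/hmusr/...` as the missing file. Dropping the `file://` scheme prefix and passing the bare path fixes the load. A minimal sketch of the fix, using a temporary file to stand in for `testconnection.properties` (the object name `PropsDemo` is illustrative, not from the original project):

```scala
import java.io.{File, FileInputStream, FileNotFoundException, PrintWriter}
import java.util.Properties

object PropsDemo {

  // Load a properties file from a plain filesystem path.
  // FileInputStream does NOT understand "file://" URIs.
  def load(path: String): Properties = {
    val props = new Properties()
    val in = new FileInputStream(path)
    try props.load(in) finally in.close()
    props
  }

  def main(args: Array[String]): Unit = {
    // Create a sample properties file standing in for testconnection.properties.
    val f = File.createTempFile("testconnection", ".properties")
    val pw = new PrintWriter(f)
    pw.println("gpDevUrl=jdbc:url")
    pw.println("devUserName=username")
    pw.close()

    // Plain path: loads fine.
    val props = load(f.getAbsolutePath)
    println("gpDevUrl: " + props.getProperty("gpDevUrl"))

    // URI-style path: FileInputStream treats "file://..." as a literal
    // file name and throws FileNotFoundException, as in the question.
    try {
      load("file://" + f.getAbsolutePath)
    } catch {
      case _: FileNotFoundException =>
        println("file:// prefix -> FileNotFoundException")
    }
    f.delete()
  }
}
```

So in the question's code, changing `conFile` to `/home/hmusr/ReconTest/inputdir/testconnection.properties` should be enough. Alternatively, `new File(new java.net.URI(conFile))` converts a `file://` URI into a `File` that `FileInputStream` accepts. One further note: this works here because `--master local[2]` runs everything on the local machine; if the job were submitted in cluster mode, the properties file would have to be shipped to the executors (e.g. with `spark-submit --files` and read back via `org.apache.spark.SparkFiles.get`).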
