Reading files from HDFS with Scala and creating an RDD from them

Problem description

I am trying to load some files from HDFS using Scala.

However, whenever I try to load them, I get the same error.

Location of the HDFS file: hdfs/test/dir/text.txt

(I have more files in /dir)

My code:

// Spark Packages
import org.apache.spark.SparkContext
import org.apache.spark.SparkConf

// Initializing Spark
val conf = new SparkConf().setAppName("training").setMaster("master")
new SparkContext(conf)

// Read files from HDFS and convert to RDD.
val rdd = sc.textFile("/test/dir/*")

My error:

18/04/29 05:44:30 INFO storage.MemoryStore: ensureFreeSpace(280219) called with curMem=301375, maxMem=257918238
18/04/29 05:44:30 INFO storage.MemoryStore: Block broadcast_1 stored as values in memory (estimated size 273.7 KB, free 245.4 MB)
18/04/29 05:44:31 INFO storage.MemoryStore: ensureFreeSpace(21204) called with curMem=581594, maxMem=257918238
18/04/29 05:44:31 INFO storage.MemoryStore: Block broadcast_1_piece0 stored as bytes in memory (estimated size 20.7 KB, free 245.4 MB)
18/04/29 05:44:31 ERROR actor.OneForOneStrategy: 
java.lang.NullPointerException
    at org.apache.spark.storage.BlockManagerMasterActor.org$apache$spark$storage$BlockManagerMasterActor$$updateBlockInfo(BlockManagerMasterActor.scala:359)
    at org.apache.spark.storage.BlockManagerMasterActor$$anonfun$receiveWithLogging$1.applyOrElse(BlockManagerMasterActor.scala:75)
    at scala.runtime.AbstractPartialFunction$mcVL$sp.apply$mcVL$sp(AbstractPartialFunction.scala:33)
    at scala.runtime.AbstractPartialFunction$mcVL$sp.apply(AbstractPartialFunction.scala:33)
    at scala.runtime.AbstractPartialFunction$mcVL$sp.apply(AbstractPartialFunction.scala:25)
    at org.apache.spark.util.ActorLogReceive$$anon$1.apply(ActorLogReceive.scala:53)
    at org.apache.spark.util.ActorLogReceive$$anon$1.apply(ActorLogReceive.scala:42)
    at scala.PartialFunction$class.applyOrElse(PartialFunction.scala:118)
    at 

and more...

How can I fix this? Or is it caused by a mistake in my syntax?

Many thanks in advance.

Tags: scala, apache-spark, hdfs

Solution


Removing the following lines allowed the code to run:

// Initializing Spark
val conf = new SparkConf().setAppName("training").setMaster("master")
new SparkContext(conf)
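
This likely works because the code was being run in spark-shell, which already creates a SparkContext and exposes it as sc; constructing a second context on top of it, with the literal string "master" as the master URL (not a valid master), then fails. In the shell you can simply call sc.textFile(...) directly. For a standalone application, the context has to be created once and kept, roughly as in the sketch below. The object name, the local[*] master, and the namenode URI are placeholders for illustration, not part of the original question.

// Minimal standalone sketch, not the original poster's code.
// Assumes the app is submitted with spark-submit; "local[*]" and the
// hdfs://namenode:8020 URI are placeholders for your own cluster.
import org.apache.spark.{SparkConf, SparkContext}

object ReadFromHdfs {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf()
      .setAppName("training")
      .setMaster("local[*]") // or a real master URL such as spark://host:7077

    // Create exactly one SparkContext and keep a reference to it.
    val sc = new SparkContext(conf)

    // Relative paths resolve against fs.defaultFS; a fully qualified
    // URI such as "hdfs://namenode:8020/test/dir/*" also works.
    val rdd = sc.textFile("/test/dir/*")
    println(s"Line count: ${rdd.count()}")

    sc.stop()
  }
}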
