AbstractMethodError in the GraphLoader object

Problem description

I created a simple GraphX project, and when I try to run this test project I get an AbstractMethodError. The error occurs inside the edgeListFile method and looks like something logger-related that I can't figure out. Please help.

Here is my .scala file:

object graphtest extends App {

  import org.apache.spark.graphx.{GraphLoader, VertexId}
  import org.apache.spark.sql.SparkSession

  val spark = SparkSession.builder.master("local").appName("learning spark").getOrCreate
  val sc = spark.sparkContext

  // Load the citation graph from an edge list file
  val graph1 = GraphLoader.edgeListFile(sc, "E:\\code\\Cit-HepTh.txt")

  // Find the vertex with the highest in-degree
  val res: (VertexId, Int) = graph1.inDegrees.reduce((a, b) => if (a._2 > b._2) a else b)

  // Print the first ten edges
  graph1.edges.collect().take(10).foreach(println)
}

Here is my build.sbt file:

name := "myproject"

version := "0.1"

scalaVersion := "2.11.8"

mainClass in (Compile, packageBin) := Some("myproject.Processor")

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "2.3.1",
  "org.apache.spark" %% "spark-sql" % "2.3.1",
  "org.scalatest" %% "scalatest" % "3.2.0-SNAP10" % Test,
  "com.typesafe" % "config" % "1.3.1",
  "org.apache.spark" %% "spark-mllib" % "2.0.1"
)

And finally, the full stack trace of the failure:

Exception in thread "main" java.lang.AbstractMethodError
at org.apache.spark.internal.Logging$class.initializeLogIfNecessary(Logging.scala:99)
at org.apache.spark.graphx.GraphLoader$.initializeLogIfNecessary(GraphLoader.scala:28)
at org.apache.spark.internal.Logging$class.log(Logging.scala:46)
at org.apache.spark.graphx.GraphLoader$.log(GraphLoader.scala:28)
at org.apache.spark.internal.Logging$class.logInfo(Logging.scala:54)
at org.apache.spark.graphx.GraphLoader$.logInfo(GraphLoader.scala:28)
at org.apache.spark.graphx.GraphLoader$.edgeListFile(GraphLoader.scala:96)
at aaa.graphtest$.delayedEndpoint$zettasense$graphtest$1(Test.scala:15)
at aaa.graphtest$delayedInit$body.apply(Test.scala:6)
at scala.Function0$class.apply$mcV$sp(Function0.scala:34)
at scala.runtime.AbstractFunction0.apply$mcV$sp(AbstractFunction0.scala:12)
at scala.App$$anonfun$main$1.apply(App.scala:76)
at scala.App$$anonfun$main$1.apply(App.scala:76)
at scala.collection.immutable.List.foreach(List.scala:381)
at scala.collection.generic.TraversableForwarder$class.foreach(TraversableForwarder.scala:35)
at scala.App$class.main(App.scala:76)
at aaa.graphtest$.main(Test.scala:6)
at aaa.graphtest.main(Test.scala)

Tags: scala, apache-spark, spark-graphx

Solution

This was a library version mismatch. Even though build.sbt declares no GraphX dependency, spark-mllib 2.0.1 pulls in spark-graphx 2.0.1 transitively, so a GraphLoader compiled against the old internal Logging trait ran against the newer trait from spark-core 2.3.1, which is what produced the AbstractMethodError. I updated spark-core, spark-sql, and spark-mllib to the same, latest version and it ran smoothly. Here is what my build.sbt looks like now:

name := "myproject"

version := "0.1"

scalaVersion := "2.11.8"

mainClass in (Compile, packageBin) := Some("myproject.Processor")


libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "2.4.0",
  "org.apache.spark" %% "spark-sql" % "2.4.0",
  "org.scalatest" %% "scalatest" % "3.2.0-SNAP10" % Test,
  "com.typesafe" % "config" % "1.3.1",
  "org.apache.spark" %% "spark-mllib" % "2.4.0"
)
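
To keep the Spark modules from drifting out of sync again, one common pattern is to factor the version out into a single value, and to declare spark-graphx explicitly since the code imports it directly, instead of relying on the transitive copy that spark-mllib brings in. A minimal build.sbt sketch of that idea (the sparkVersion name is just an illustrative choice):

// Single source of truth for every Spark artifact
val sparkVersion = "2.4.0"

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core"   % sparkVersion,
  "org.apache.spark" %% "spark-sql"    % sparkVersion,
  "org.apache.spark" %% "spark-mllib"  % sparkVersion,
  // declared explicitly because the code imports org.apache.spark.graphx
  "org.apache.spark" %% "spark-graphx" % sparkVersion
)

With every Spark module resolved to the same release, the Logging trait that GraphLoader mixes in comes from one binary version, which removes the source of the AbstractMethodError.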
