scala - AbstractMethodError in the GraphLoader object
Problem description
I created a simple GraphX project, and when I try to run it I get an AbstractMethodError. The error is thrown inside the edgeListFile method and looks like it is related to the logger, but I can't see what's wrong. Please help.
Here is my .scala file:
object graphtest extends App {
  import org.apache.spark.graphx.{GraphLoader, VertexId}
  import org.apache.spark.sql.SparkSession

  val spark = SparkSession.builder.master("local").appName("learning spark").getOrCreate
  val sc = spark.sparkContext
  val graph1 = GraphLoader.edgeListFile(sc, "E:\\code\\Cit-HepTh.txt")
  val res: (VertexId, Int) = graph1.inDegrees.reduce((a, b) => if (a._2 > b._2) a else b)
  graph1.edges.collect().take(10).foreach(println)
}
Here is my build.sbt file:
name := "myproject"
version := "0.1"
scalaVersion := "2.11.8"
mainClass in (Compile, packageBin) := Some("myproject.Processor")
libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "2.3.1",
  "org.apache.spark" %% "spark-sql" % "2.3.1",
  "org.scalatest" %% "scalatest" % "3.2.0-SNAP10" % Test,
  "com.typesafe" % "config" % "1.3.1",
  "org.apache.spark" %% "spark-mllib" % "2.0.1"
)
And finally, the full stack trace of the failure:
Exception in thread "main" java.lang.AbstractMethodError
at org.apache.spark.internal.Logging$class.initializeLogIfNecessary(Logging.scala:99)
at org.apache.spark.graphx.GraphLoader$.initializeLogIfNecessary(GraphLoader.scala:28)
at org.apache.spark.internal.Logging$class.log(Logging.scala:46)
at org.apache.spark.graphx.GraphLoader$.log(GraphLoader.scala:28)
at org.apache.spark.internal.Logging$class.logInfo(Logging.scala:54)
at org.apache.spark.graphx.GraphLoader$.logInfo(GraphLoader.scala:28)
at org.apache.spark.graphx.GraphLoader$.edgeListFile(GraphLoader.scala:96)
at aaa.graphtest$.delayedEndpoint$zettasense$graphtest$1(Test.scala:15)
at aaa.graphtest$delayedInit$body.apply(Test.scala:6)
at scala.Function0$class.apply$mcV$sp(Function0.scala:34)
at scala.runtime.AbstractFunction0.apply$mcV$sp(AbstractFunction0.scala:12)
at scala.App$$anonfun$main$1.apply(App.scala:76)
at scala.App$$anonfun$main$1.apply(App.scala:76)
at scala.collection.immutable.List.foreach(List.scala:381)
at scala.collection.generic.TraversableForwarder$class.foreach(TraversableForwarder.scala:35)
at scala.App$class.main(App.scala:76)
at aaa.graphtest$.main(Test.scala:6)
at aaa.graphtest.main(Test.scala)
Solution
This was a library version mismatch. I updated spark-core, spark-sql, and spark-mllib to the latest version and it now runs fine. Here is what my build.sbt looks like now:
name := "myproject"
version := "0.1"
scalaVersion := "2.11.8"
mainClass in (Compile, packageBin) := Some("myproject.Processor")
libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "2.4.0",
  "org.apache.spark" %% "spark-sql" % "2.4.0",
  "org.scalatest" %% "scalatest" % "3.2.0-SNAP10" % Test,
  "com.typesafe" % "config" % "1.3.1",
  "org.apache.spark" %% "spark-mllib" % "2.4.0"
)
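For context: the original build mixed spark-core 2.3.1 with spark-mllib 2.0.1, and spark-mllib transitively pulls in spark-graphx, so GraphLoader came from a 2.0.1 jar compiled against an older org.apache.spark.internal.Logging trait than the one on the classpath from core 2.3.1, which is the typical recipe for an AbstractMethodError at runtime. One way to make this class of mismatch impossible is to pin every Spark module to a single version constant. The sketch below assumes that convention (the `sparkVersion` val is my own naming, not from the original post), and also declares spark-graphx explicitly rather than relying on the transitive dependency:

```scala
// build.sbt sketch: keep all Spark modules on exactly one version so
// binary-incompatible mixes (the cause of the AbstractMethodError) cannot occur.
val sparkVersion = "2.4.0"

name := "myproject"
version := "0.1"
scalaVersion := "2.11.8"

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core"   % sparkVersion,
  "org.apache.spark" %% "spark-sql"    % sparkVersion,
  "org.apache.spark" %% "spark-mllib"  % sparkVersion,
  // GraphLoader lives in spark-graphx; declaring it explicitly is clearer
  // than relying on spark-mllib to pull it in transitively.
  "org.apache.spark" %% "spark-graphx" % sparkVersion,
  "org.scalatest"    %% "scalatest"    % "3.2.0-SNAP10" % Test,
  "com.typesafe"     %  "config"       % "1.3.1"
)
```

On sbt 1.4+ you can also run `sbt dependencyTree` to see which versions of the Spark modules actually end up on the classpath.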