java.lang.NoClassDefFoundError: org/apache/htrace/core/Tracer$Builder - Scala Spark

Problem Description

I am trying to run a WordCount example with Scala and Spark, but I am hitting the following problem:

java.lang.NoClassDefFoundError: org/apache/htrace/core/Tracer$Builder
    at org.apache.hadoop.fs.FsTracer.get(FsTracer.java:42)
    at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:3232)
    at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:123)
    at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:3286)
    at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:3254)
    at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:478)
    at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:226)
    at es.santander.prueba.pruebas.WordCount.count(WordCount.scala:123)
    at es.santander.prueba.pruebas.WordCount.main(WordCount.scala:50)
    at es.santander.prueba.pruebas.Main$.main(WordCount.scala:27)
    at es.santander.prueba.pruebas.Main.main(WordCount.scala)
Caused by: java.lang.ClassNotFoundException: org.apache.htrace.core.Tracer$Builder
    at java.base/jdk.internal.loader.BuiltinClassLoader.loadClass(BuiltinClassLoader.java:582)
    at java.base/jdk.internal.loader.ClassLoaders$AppClassLoader.loadClass(ClassLoaders.java:178)
    at java.base/java.lang.ClassLoader.loadClass(ClassLoader.java:521)
    ... 11 more

I have been looking for a way to solve this and added the following jars to my pom.xml:

<!-- https://mvnrepository.com/artifact/org.apache.hbase/hbase -->
<dependency>
    <groupId>org.apache.hbase</groupId>
    <artifactId>hbase</artifactId>
    <version>2.4.5</version>
    <type>pom</type>
</dependency>


<!-- https://mvnrepository.com/artifact/org.htrace/htrace-core -->
<dependency>
    <groupId>org.htrace</groupId>
    <artifactId>htrace-core</artifactId>
    <version>3.0.4</version>
</dependency>

<!-- https://mvnrepository.com/artifact/org.apache.htrace/htrace-hbase -->
<dependency>
    <groupId>org.apache.htrace</groupId>
    <artifactId>htrace-hbase</artifactId>
    <version>4.1.0-incubating</version>
</dependency>

But the problem persists. Am I doing something wrong, or do I need something more?

EDIT: This is where I get the error:

    FileSystem.get(spark.sparkContext.hadoopConfiguration).delete(new Path(outputFile), true)

Tags: scala, apache-spark

Solution
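
The class named in the error, org.apache.htrace.core.Tracer$Builder, ships in the htrace-core4 artifact: HTrace 4.x moved its classes into the org.apache.htrace.core package, whereas org.htrace:htrace-core 3.0.4 uses the older package layout and does not provide that class. A minimal sketch of a dependency that puts the class on the classpath, assuming a Hadoop 2.8+/3.x client as the FsTracer stack frame suggests (the version shown is an assumption; match it to whatever your hadoop-client declares):

<!-- Sketch: htrace-core4 contains org.apache.htrace.core.Tracer$Builder -->
<dependency>
    <groupId>org.apache.htrace</groupId>
    <artifactId>htrace-core4</artifactId>
    <version>4.1.0-incubating</version>
</dependency>

If the dependency resolves at compile time but the error still appears at runtime, that usually points to the jar being absent from the runtime classpath, for example when submitting with spark-submit without bundling it into the application jar.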


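As a quick sanity check that the artifact really is visible at runtime, a small hypothetical Scala probe (not part of the original question) can try to load the class before Spark ever touches the file system:

import org.apache.spark.sql.SparkSession

object HtraceCheck {
  def main(args: Array[String]): Unit = {
    // Throws ClassNotFoundException if htrace-core4 is missing from the runtime classpath.
    val cls = Class.forName("org.apache.htrace.core.Tracer$Builder")
    println(s"Found ${cls.getName} via ${cls.getClassLoader}")

    // Only after the class loads is it safe to call FileSystem.get as in the question.
    val spark = SparkSession.builder().appName("HtraceCheck").master("local[*]").getOrCreate()
    println(spark.sparkContext.hadoopConfiguration.get("fs.defaultFS"))
    spark.stop()
  }
}

Running this through the same spark-submit invocation used for WordCount helps distinguish a genuinely missing dependency from a packaging or classpath problem.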