Scala / sbt java.lang.ClassNotFoundException: com.ullink.slack.simpleslackapi.listeners.SlackMessagePostedListener

Problem description

I want to include the Slack API in a Spark Streaming project built with Scala/sbt. When I run the program I get a class-not-found exception - I'm guessing there's a dependency problem?

Error:

Exception in thread "main" java.lang.NoClassDefFoundError: com/ullink/slack/simpleslackapi/listeners/SlackMessagePostedListener
    at org.apache.spark.sql.streaming.DataStreamReader.load(DataStreamReader.scala:175)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
    at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:894)
    at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:198)
    at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:228)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:137)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.ClassNotFoundException: com.ullink.slack.simpleslackapi.listeners.SlackMessagePostedListener
    at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
    ... 17 more

build.sbt looks like:

    name := "slackKnowledgeCollector"
    version := "0.1"
    scalaVersion := "2.11.8"

    val sparkVersion = "2.3.1"

    resolvers ++= Seq(
      "Hortonworks" at "http://repo.hortonworks.com/content/repositories/releases/",
      "Hortonworks Groups" at "http://repo.hortonworks.com/content/groups/public/",
      "Apache Snapshots" at "https://repository.apache.org/content/repositories/releases/",
      "Maven Central" at "http://central.maven.org/maven2/",
      Resolver.mavenLocal
    )

    libraryDependencies ++= Seq(
      "org.apache.spark" %% "spark-core" % sparkVersion % Provided,
      "org.apache.spark" %% "spark-sql" % sparkVersion % Provided,
      "org.apache.spark" %% "spark-sql-kafka-0-10" % sparkVersion,
      //"org.apache.spark" %% "spark-streaming" % sparkVersion % Provided,
      //"org.apache.spark" %% "spark-streaming-kafka-0-10" % sparkVersion % Provided,
      "org.apache.kafka" %% "kafka" % "0.10.0.2.5.3.0-37",
      "com.ullink.slack" % "simpleslackapi" % "1.2.0" excludeAll(
        ExclusionRule(organization = "org.apache.httpcomponents"),
        ExclusionRule(organization = "com.google.guava"),
        ExclusionRule(organization = "ch.qos.logback"),
        ExclusionRule(organization = "org.slf4j")
      )
    )

    assemblyOption in assembly := (assemblyOption in assembly).value.copy(includeScala = false, cacheOutput = false)
    test in assembly := {}
    assemblyMergeStrategy in assembly := {
      case PathList("org", "apache", "spark", "unused", "UnusedStubClass.class") => MergeStrategy.discard
      case x =>
        val oldStrategy = (assemblyMergeStrategy in assembly).value
        oldStrategy(x)
    }

    scalacOptions += "-target:jvm-1.8"

Does anyone know how to fix this?

Tags: scala, sbt, slack-api, apache-spark-2.0

Solution


Problem solved... I was submitting the plain jar instead of the uber jar. The thread can be closed.
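In other words: `sbt package` produces a jar containing only the project's own classes, so `simpleslackapi` is missing at runtime; `sbt assembly` produces a fat ("uber") jar that bundles the non-`Provided` dependencies. A sketch of the workflow, assuming the build.sbt above (the jar name and main class are illustrative assumptions, not from the original post):

```shell
# Build the uber jar with the sbt-assembly plugin. It bundles
# simpleslackapi and kafka; spark-core/spark-sql are marked
# "Provided" above, so they stay out of the jar and come from
# the cluster at runtime.
sbt assembly

# Submit the *assembly* jar, not the plain jar from `sbt package`
# (e.g. target/scala-2.11/slackknowledgecollector_2.11-0.1.jar).
# The main class name here is a placeholder.
spark-submit \
  --class com.example.SlackKnowledgeCollector \
  --master local[*] \
  target/scala-2.11/slackKnowledgeCollector-assembly-0.1.jar
```

The `includeScala = false` setting in build.sbt also keeps the Scala library out of the assembly jar, which is correct here because spark-submit provides it.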

