DataStax Cassandra - Scala Spark application - SBT build fails

Problem description

I have a simple demo Scala application that reads from a file and prints to the screen. I am trying to build it with sbt and submit it to DataStax Spark. The SBT instructions in the DataStax documentation appear to be incomplete: the page at https://docs.datastax.com/en/dse/6.0/dse-dev/datastax_enterprise/spark/sparkJavaApi.html cannot be used as-is because it is missing the link to the DataStax repo.
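
For context, a minimal sketch of what such an app might look like, assuming it runs as a Spark job; the object name, app name, and argument handling are illustrative, not taken from the original post:

import org.apache.spark.sql.SparkSession

object Demo {
  def main(args: Array[String]): Unit = {
    // Master URL and cluster settings are supplied at submit time,
    // so only the application name is set here.
    val spark = SparkSession.builder.appName("Demo").getOrCreate()

    // Read a text file and print each line on the driver.
    spark.read.textFile(args(0)).collect().foreach(println)

    spark.stop()
  }
}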

After some searching, I found a sample build.sbt at https://github.com/datastax/SparkBuildExamples/blob/master/scala/sbt/dse/build.sbt, which got the furthest.

It fails here:

[error] unresolved dependency: org.apache.directory.api#api-ldap-codec-standalone;1.0.0.2.dse: not found
[error] unresolved dependency: org.apache.directory.api#api-ldap-extras-codec;1.0.0.2.dse: not found
[error] unresolved dependency: org.apache.directory.api#api-ldap-net-mina;1.0.0.2.dse: not found
[error] unresolved dependency: org.apache.directory.api#api-ldap-codec-core;1.0.0.2.dse: not found
[error] unresolved dependency: org.apache.directory.api#api-ldap-extras-aci;1.0.0.2.dse: not found
[error] unresolved dependency: org.apache.directory.api#api-ldap-extras-codec-api;1.0.0.2.dse: not found
[error] unresolved dependency: org.apache.directory.api#api-ldap-model;1.0.0.2.dse: not found
[error] unresolved dependency: org.apache.directory.api#api-asn1-ber;1.0.0.2.dse: not found
[error] unresolved dependency: org.apache.directory.api#api-util;1.0.0.2.dse: not found
[error] unresolved dependency: org.apache.directory.api#api-asn1-api;1.0.0.2.dse: not found
[error] unresolved dependency: org.apache.directory.api#api-i18n;1.0.0.2.dse: not found

The key parts of build.sbt are:

scalaVersion := "2.11.8"

resolvers += Resolver.mavenLocal // for testing
resolvers += "DataStax Repo" at "https://repo.datastax.com/public-repos/"

val dseVersion = "6.0.0"

libraryDependencies += "com.datastax.dse" % "dse-spark-dependencies" % dseVersion % "provided" exclude(
    "org.slf4j", "log4j-over-slf4j", "org.apache.directory.api")

libraryDependencies ++= Seq(
  "junit" % "junit" % "4.12" % "test"
).map(_.excludeAll(
  ExclusionRule("org.slf4j","log4j-over-slf4j"),
  ExclusionRule("org.slf4j","slf4j-log4j12"))
)  // Excluded to allow for Cassandra to run embedded

This looks like a broken dependency. Can you offer any advice?

Tags: scala, apache-spark, sbt, datastax-enterprise

Solution

Try the following dependencies instead:

scalaVersion := "2.11.8"

libraryDependencies += "com.datastax.spark" %% "spark-cassandra-connector" % "2.0.9"
