scala - Error when importing an sbt project for Apache Phoenix
Problem description
I am a beginner. I am trying to use sbt to pull in the Phoenix library so I can read an HBase table from Spark, but my build.sbt keeps failing.
Error while importing the sbt project:
[error] stack trace is suppressed; run 'last update' for the full output
[error] stack trace is suppressed; run 'last ssExtractDependencies' for the full output
[error] (update) sbt.librarymanagement.ResolveException: Error downloading org.apache.hbase:hbase-common:${cdh.hbase.version}
[error] Not found
[error] Not found
[error] not found: /Users/johnny/.ivy2/local/org.apache.hbase/hbase-common/${cdh.hbase.version}/ivys/ivy.xml
[error] not found: https://repo1.maven.org/maven2/org/apache/hbase/hbase-common/${cdh.hbase.version}/hbase-common-${cdh.hbase.version}.pom
[error] Error downloading org.apache.hbase:hbase-hadoop-compat:${cdh.hbase.version}
[error] Not found
[error] Not found
[error] not found: /Users/johnny/.ivy2/local/org.apache.hbase/hbase-hadoop-compat/${cdh.hbase.version}/ivys/ivy.xml
[error] not found: https://repo1.maven.org/maven2/org/apache/hbase/hbase-hadoop-compat/${cdh.hbase.version}/hbase-hadoop-compat-${cdh.hbase.version}.pom
[error] not found: https://repo1.maven.org/maven2/org/apache/hadoop/hadoop-common/${cdh.hadoop.version}/hadoop-common-${cdh.hadoop.version}.pom
[error] Error downloading org.apache.hbase:hbase-hadoop2-compat:${cdh.hbase.version}
[error] Not found
[error] Not found
[error] not found: /Users/johnny/.ivy2/local/org.apache.hbase/hbase-hadoop2-compat/${cdh.hbase.version}/ivys/ivy.xml
[error] not found: https://repo1.maven.org/maven2/org/apache/hbase/hbase-hadoop2-compat/${cdh.hbase.version}/hbase-hadoop2-compat-${cdh.hbase.version}.pom
[error] Error downloading org.apache.hbase:hbase-annotations:${cdh.hbase.version}
[error] Not found
[error] Not found
[error] not found: /Users/johnny/.ivy2/local/org.apache.hbase/hbase-annotations/${cdh.hbase.version}/ivys/ivy.xml
[error] not found: https://repo1.maven.org/maven2/org/apache/hbase/hbase-annotations/${cdh.hbase.version}/hbase-annotations-${cdh.hbase.version}.pom
[error] Error downloading org.apache.hbase:hbase-protocol:${cdh.hbase.version}
[error] Not found
[error] Not found
[error] not found: /Users/johnny/.ivy2/local/org.apache.hbase/hbase-protocol/${cdh.hbase.version}/ivys/ivy.xml
[error] not found: https://repo1.maven.org/maven2/org/apache/hbase/hbase-protocol/${cdh.hbase.version}/hbase-protocol-${cdh.hbase.version}.pom
[error] Error downloading org.apache.hbase:hbase-client:${cdh.hbase.version}
[error] Not found
[error] Not found
[error] not found: /Users/johnny/.ivy2/local/org.apache.hbase/hbase-client/${cdh.hbase.version}/ivys/ivy.xml
[error] not found: https://repo1.maven.org/maven2/org/apache/hbase/hbase-client/${cdh.hbase.version}/hbase-client-${cdh.hbase.version}.pom
[error] Error downloading org.apache.hbase:hbase-server:${cdh.hbase.version}
[error] Not found
[error] Not found
[error] not found: /Users/johnny/.ivy2/local/org.apache.hbase/hbase-server/${cdh.hbase.version}/ivys/ivy.xml
[error] not found: https://repo1.maven.org/maven2/org/apache/hbase/hbase-server/${cdh.hbase.version}/hbase-server-${cdh.hbase.version}.pom
[error] Error downloading com.cloudera.cdh:cdh-root:5.11.2
[error] Not found
[error] Not found
[error] not found: /Users/johnny/.ivy2/local/com.cloudera.cdh/cdh-root/5.11.2/ivys/ivy.xml
[error] not found: https://repo1.maven.org/maven2/com/cloudera/cdh/cdh-root/5.11.2/cdh-root-5.11.2.pom
[error] Total time: 3 s, completed Sep 27, 2019, 4:54:09 PM
[info] shutting down sbt server
My build.sbt is:
name := "SparkHbase"
version := "0.1"
scalaVersion := "2.11.12"
libraryDependencies ++= Seq(
"org.apache.spark" %% "spark-core" % "2.2.0" % "provided"
,"org.apache.spark" %% "spark-sql" % "2.2.0" % "provided"
,"org.apache.spark" %% "spark-hive" % "2.2.0" % "provided"
,"org.apache.phoenix" % "phoenix-spark" % "4.13.2-cdh5.11.2"
)
I even added this resolver: resolvers += "ClouderaRepo" at "https://repository.cloudera.com/content/repositories/releases"
but the errors persist. What am I doing wrong?
Solution
The problem is that you are trying to use a very old phoenix-spark artifact. Its POM inherits version properties such as ${cdh.hbase.version} from the Cloudera parent POM (cdh-root), which your resolvers cannot fetch, so the placeholders are never substituted and sbt ends up requesting literal ${cdh.hbase.version} paths. If you are on HBase 1.3, you can use version 4.14.3-HBase-1.3 instead; see the build.sbt below:
name := "SparkHbase"
version := "0.1"
scalaVersion := "2.11.12"
resolvers += "Cloudera" at "https://repository.cloudera.com/content/repositories/releases/"
resolvers += "Cloudera_Artifactory" at "https://repository.cloudera.com/artifactory/cloudera-repos/"
resolvers += Resolver.sonatypeRepo("releases")
libraryDependencies ++= Seq(
"org.apache.spark" %% "spark-core" % "2.2.0" % "provided"
,"org.apache.spark" %% "spark-sql" % "2.2.0" % "provided"
,"org.apache.spark" %% "spark-hive" % "2.2.0" % "provided"
,"org.apache.phoenix" % "phoenix-spark" % "4.14.3-HBase-1.3"
)
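If the phoenix-spark artifact pulls in transitive HBase jars that clash with the ones already on your cluster, sbt can exclude them and pin an explicit version. This is only a sketch: the HBase version shown (1.3.1) is an assumption and must match your actual cluster, not a value taken from the question.

```scala
// Hedged sketch for build.sbt: drop Phoenix's transitive hbase-server
// and pin a cluster-matching hbase-client explicitly. "1.3.1" is an
// assumed example version; substitute the version your cluster runs.
libraryDependencies ++= Seq(
  ("org.apache.phoenix" % "phoenix-spark" % "4.14.3-HBase-1.3")
    .exclude("org.apache.hbase", "hbase-server"),
  "org.apache.hbase" % "hbase-client" % "1.3.1" % "provided"
)
```

Excluding and re-pinning like this keeps the versions on the sbt classpath aligned with the cluster's, which avoids the runtime NoSuchMethodError-style failures that mismatched HBase jars often cause.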