java - SparkSession does not work if org.apache.hive:hive-service is added to the dependencies
Problem description
I am implementing a simple Java program that uses Spark SQL to read from a Parquet file and builds an ArrayList of FieldSchema objects (from the Hive metastore API), where each object represents a column with its name and data type. However, Spark SQL does not seem to be able to coexist with the FieldSchema import.
For example, given the same program:
import org.apache.spark.sql.SparkSession;

public class main {
    public static void main(String[] args) {
        SparkSession spark = SparkSession.builder()
                .appName("Application Name")
                .config("spark.master", "local")
                .getOrCreate();
    }
}
this dependency configuration in build.gradle (IntelliJ)
dependencies {
    testImplementation 'org.junit.jupiter:junit-jupiter-api:5.7.0'
    testRuntimeOnly 'org.junit.jupiter:junit-jupiter-engine:5.7.0'
    implementation 'org.apache.spark:spark-sql_2.12:3.1.1'
    implementation 'org.apache.spark:spark-core_2.12:3.1.1'
}
makes the program run successfully.
On the other hand, this dependency configuration (added in order to later import org.apache.hadoop.hive.metastore.api.FieldSchema)
dependencies {
    testImplementation 'org.junit.jupiter:junit-jupiter-api:5.7.0'
    testRuntimeOnly 'org.junit.jupiter:junit-jupiter-engine:5.7.0'
    implementation "org.apache.hive:hive-service:+"
    implementation 'org.apache.spark:spark-sql_2.12:3.1.1'
    implementation 'org.apache.spark:spark-core_2.12:3.1.1'
}
produces the following error output:
WARNING: An illegal reflective access operation has occurred
WARNING: Illegal reflective access by org.apache.spark.unsafe.Platform (file:/Users/davidtran/.gradle/caches/modules-2/files-2.1/org.apache.spark/spark-unsafe_2.12/3.1.1/1c3b07cb82e71d0519e5222a5ff38758ab499034/spark-unsafe_2.12-3.1.1.jar) to constructor java.nio.DirectByteBuffer(long,int)
WARNING: Please consider reporting this to the maintainers of org.apache.spark.unsafe.Platform
WARNING: Use --illegal-access=warn to enable warnings of further illegal reflective access operations
WARNING: All illegal access operations will be denied in a future release
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
Exception in thread "main" java.lang.NoSuchFieldError: JAVA_9
at org.apache.spark.storage.StorageUtils$.<init>(StorageUtils.scala:207)
at org.apache.spark.storage.StorageUtils$.<clinit>(StorageUtils.scala)
at org.apache.spark.storage.BlockManagerMasterEndpoint.<init>(BlockManagerMasterEndpoint.scala:109)
at org.apache.spark.SparkEnv$.$anonfun$create$9(SparkEnv.scala:371)
at org.apache.spark.SparkEnv$.registerOrLookupEndpoint$1(SparkEnv.scala:311)
at org.apache.spark.SparkEnv$.create(SparkEnv.scala:359)
at org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:189)
at org.apache.spark.SparkContext.createSparkEnv(SparkContext.scala:277)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:458)
at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2678)
at org.apache.spark.sql.SparkSession$Builder.$anonfun$getOrCreate$2(SparkSession.scala:942)
at scala.Option.getOrElse(Option.scala:189)
at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:936)
at main.main(main.java:6)
Solution
I was able to fix the following error by excluding org.apache.commons:commons-lang3 from the dependency list that org.apache.hadoop:hadoop-common declares in its pom.xml:
java.lang.NoSuchFieldError: JAVA_9
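A minimal sketch of what that exclusion might look like in build.gradle. This assumes the old commons-lang3 is pulled in transitively via hive-service's hadoop-common dependency; verify the actual path in your build with `gradle dependencies` or `gradle dependencyInsight --dependency commons-lang3` before copying it:

```groovy
dependencies {
    testImplementation 'org.junit.jupiter:junit-jupiter-api:5.7.0'
    testRuntimeOnly 'org.junit.jupiter:junit-jupiter-engine:5.7.0'
    implementation("org.apache.hive:hive-service:+") {
        // An older transitive commons-lang3 predates the JavaVersion.JAVA_9
        // constant (added in commons-lang3 3.5) that Spark 3.1.1 reads at
        // startup, hence the NoSuchFieldError. Excluding it here lets the
        // newer version that Spark brings in win.
        exclude group: 'org.apache.commons', module: 'commons-lang3'
    }
    implementation 'org.apache.spark:spark-sql_2.12:3.1.1'
    implementation 'org.apache.spark:spark-core_2.12:3.1.1'
}
```

Note that `hive-service:+` resolves to whatever version is newest at build time; pinning an explicit hive-service version makes the dependency tree, and therefore this exclusion, reproducible.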