HDFS Kafka Connect - Hive Integration: Error Creating Table

Problem

I am trying to sink data to HDFS, and that part works fine, but when I enable Hive integration so the data is visible in Hive, I get a URI-related error.

I have already tried using store.url instead of hdfs.url, but that fails with a NullPointerException.

My hdfs-sink.json configuration:

"connector.class": "io.confluent.connect.hdfs3.Hdfs3SinkConnector",
"tasks.max": "1",
"topics": "users",
"hdfs.url": "hdfs://192.168.1.221:9000",
"flush.size": "5",
"key.converter": "org.apache.kafka.connect.storage.StringConverter",
"value.converter": "io.confluent.connect.avro.AvroConverter",
"confluent.topic.bootstrap.servers": "localhost:9092",
"confluent.topic.replication.factor": "1",
"key.converter.schema.registry.url":"http://localhost:8081" ,
"value.converter.schema.registry.url":"http://localhost:8081",
"hive.integration":"true",
"hive.metastore.uris":"thrift://192.168.1.221:9083",
"schema.compatibility":"BACKWARD"

I get the following error:

[2019-09-12 15:12:33,533] ERROR Creating Hive table threw unexpected error (io.confluent.connect.hdfs3.TopicPartitionWriter)
io.confluent.connect.storage.errors.HiveMetaStoreException: Hive MetaStore exception
    at io.confluent.connect.storage.hive.HiveMetaStore.doAction(HiveMetaStore.java:99)
    at io.confluent.connect.storage.hive.HiveMetaStore.createTable(HiveMetaStore.java:223)
    at io.confluent.connect.hdfs3.avro.AvroHiveUtil.createTable(AvroHiveUtil.java:52)
    at io.confluent.connect.hdfs3.DataWriter$3.createTable(DataWriter.java:285)
    at io.confluent.connect.hdfs3.TopicPartitionWriter$1.call(TopicPartitionWriter.java:796)
    at io.confluent.connect.hdfs3.TopicPartitionWriter$1.call(TopicPartitionWriter.java:792)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    at java.lang.Thread.run(Thread.java:748)
Caused by: MetaException(message:java.lang.IllegalArgumentException: java.net.URISyntaxException: Relative path in absolute URI: hdfs://192.168.1.221:9000./null/topics/users)

ERROR Altering Hive schema threw unexpected error (io.confluent.connect.hdfs3.TopicPartitionWriter)
io.confluent.connect.storage.errors.HiveMetaStoreException: Hive table not found: default.users
    at io.confluent.connect.storage.hive.HiveMetaStore$9.call(HiveMetaStore.java:297)
    at io.confluent.connect.storage.hive.HiveMetaStore$9.call(HiveMetaStore.java:290)
    at io.confluent.connect.storage.hive.HiveMetaStore.doAction(HiveMetaStore.java:97)
    at io.confluent.connect.storage.hive.HiveMetaStore.getTable(HiveMetaStore.java:303)
    at io.confluent.connect.hdfs3.avro.AvroHiveUtil.alterSchema(AvroHiveUtil.java:61)
    at io.confluent.connect.hdfs3.DataWriter$3.alterSchema(DataWriter.java:290)
    at io.confluent.connect.hdfs3.TopicPartitionWriter$2.call(TopicPartitionWriter.java:811)
    at io.confluent.connect.hdfs3.TopicPartitionWriter$2.call(TopicPartitionWriter.java:807)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    at java.lang.Thread.run(Thread.java:748)

Tags: hive, apache-kafka, hdfs, apache-kafka-connect, confluent-platform

Solution
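No answer body survived in this copy of the page. A hedged reading of the root cause: in `Relative path in absolute URI: hdfs://192.168.1.221:9000./null/topics/users`, the literal `null` path segment suggests the connector assembled the Hive table location from a directory property that was never resolved. The Confluent HDFS sink connectors expose `topics.dir` and `logs.dir` for these paths, so setting them explicitly is one thing worth trying (the values below are illustrative, not from the original post):

```json
{
  "topics.dir": "/topics",
  "logs.dir": "/logs",
  "hive.database": "default"
}
```

The second stack trace (`Hive table not found: default.users`) is most likely a downstream symptom rather than a separate problem: once `createTable` fails with the URI error, the subsequent `alterSchema` call cannot find the table it expected to exist.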
