hconnection-0x1544bdb1 closed: HBase issue in Spark Streaming

Problem Description

I am trying to read an HBase table from Spark Streaming. I have connected to HBase with the following code.

val conf: Configuration = HBaseConfiguration.create()
val hbaseConfiguration: Configuration = new Configuration()
hbaseConfiguration.addResource(new Path("/etc/hbase/conf/hbase-site.xml"))
// connection is assumed to be created elsewhere, e.g. via ConnectionFactory.createConnection(hbaseConfiguration)
val custTable: Table = connection.getTable(TableName.valueOf("cust:CustTable"))

However, I am running into the following exception.

  at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithRetries(RpcRetryingCaller.java:147)
        at org.apache.hadoop.hbase.client.HTable.get(HTable.java:935)
        at org.apache.hadoop.hbase.client.HTable.get(HTable.java:901)
        at com.optus.ndc.lbs.LbsEnrichment.EnrichS1apRecord$.fn_enrichment(EnrichS1apRecord.scala:245)
        at com.optus.ndc.lbs.LbsEnrichment.EnrichS1apRecord$$anonfun$3$$anonfun$5.apply(EnrichS1apRecord.scala:299)
        at com.optus.ndc.lbs.LbsEnrichment.EnrichS1apRecord$$anonfun$3$$anonfun$5.apply(EnrichS1apRecord.scala:299)
        at scala.collection.Iterator$$anon$11.next(Iterator.scala:328)
        at org.apache.spark.util.Utils$.getIteratorSize(Utils.scala:1598)
        at org.apache.spark.rdd.RDD$$anonfun$count$1.apply(RDD.scala:1157)
        at org.apache.spark.rdd.RDD$$anonfun$count$1.apply(RDD.scala:1157)
        at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:1870)
        at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:1870)
        at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:66)
        at org.apache.spark.scheduler.Task.run(Task.scala:89)
        at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:229)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
        at java.lang.Thread.run(Thread.java:745)
Caused by: java.io.IOException: hconnection-0x1544bdb1 closed
        at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.getKeepAliveZooKeeperWatcher(ConnectionManager.java:1806)
        at org.apache.hadoop.hbase.client.ZooKeeperRegistry.isTableOnlineState(ZooKeeperRegistry.java:122)
        at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.isTableDisabled(ConnectionManager.java:993)
        at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.relocateRegion(ConnectionManager.java:1162)
        at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.relocateRegion(ConnectionManager.java:1150)
        at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.getRegionLocation(ConnectionManager.java:971)
        at org.apache.hadoop.hbase.client.HRegionLocator.getRegionLocation(HRegionLocator.java:83)
        at org.apache.hadoop.hbase.client.RegionServerCallable.prepare(RegionServerCallable.java:79)
        at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithRetries(RpcRetryingCaller.java:124)
        ... 17 more

I passed the following options when submitting the application, but it still does not work. Could you please help?

--files "/etc/hbase/conf/hbase-site.xml" --driver-class-path "/etc/hbase/conf/"</p>

Thanks

Tags: hbase, spark-streaming

Solution

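The IOException means the underlying HConnection has already been closed by the time the Get runs on the executor. A common cause in Spark jobs is sharing a single connection or Table that was created on the driver (or closed elsewhere) with code running inside an RDD closure. The usual workaround is to create and close the connection inside foreachPartition, so every task works with a live, locally owned connection. A minimal sketch, assuming a DStream of row keys named dstream (the stream name, row-key handling, and result processing are placeholders):

import org.apache.hadoop.fs.Path
import org.apache.hadoop.hbase.{HBaseConfiguration, TableName}
import org.apache.hadoop.hbase.client.{ConnectionFactory, Get}
import org.apache.hadoop.hbase.util.Bytes

dstream.foreachRDD { rdd =>
  rdd.foreachPartition { keys =>
    // Build the configuration and connection on the executor itself,
    // instead of shipping a driver-side connection into the closure.
    val hbaseConf = HBaseConfiguration.create()
    hbaseConf.addResource(new Path("/etc/hbase/conf/hbase-site.xml"))
    val connection = ConnectionFactory.createConnection(hbaseConf)
    val table = connection.getTable(TableName.valueOf("cust:CustTable"))
    try {
      keys.foreach { key =>
        val result = table.get(new Get(Bytes.toBytes(key.toString)))
        // ... enrich the record with `result` here ...
      }
    } finally {
      table.close()
      connection.close()
    }
  }
}

Creating one connection per partition rather than per record keeps the ZooKeeper and RPC setup cost manageable; a per-executor singleton or connection pool can reduce it further.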
