Using Hierarchical Queries in Apache Spark

Problem Description

I am trying to run the following SQL query in Spark using Java:

Dataset<Row> perIDDf = sparkSession.read()
        .format("jdbc")
        .option("url", connection)
        .option("dbtable", "CI_PER_PER")
        .load();
perIDDf.createOrReplaceTempView("CI_PER_PER");

// Oracle-style hierarchical query (START WITH ... CONNECT BY PRIOR).
Dataset<Row> perPerDF = sparkSession.sql("select per_id1,per_id2 " +
        "from CI_PER_PER " +
        "start with per_id1='2001822000' " +
        "connect by prior per_id1=per_id2");
perPerDF.show(10, false);

I get the following error:

Exception in thread "main" org.apache.spark.sql.catalyst.parser.ParseException:
mismatched input 'with' expecting <EOF>(line 1, pos 45)

== SQL ==
select per_id1,per_id2 from CI_PER_PER start with per_id1='2001822000' connect by prior per_id1=per_id2
---------------------------------------------^^^

        at org.apache.spark.sql.catalyst.parser.ParseException.withCommand(ParseDriver.scala:239)
        at org.apache.spark.sql.catalyst.parser.AbstractSqlParser.parse(ParseDriver.scala:115)
        at org.apache.spark.sql.execution.SparkSqlParser.parse(SparkSqlParser.scala:48)
        at org.apache.spark.sql.catalyst.parser.AbstractSqlParser.parsePlan(ParseDriver.scala:69)
        at org.apache.spark.sql.SparkSession.sql(SparkSession.scala:638)
        at com.tfmwithspark.TestMaterializedView.main(TestMaterializedView.java:127)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
        at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:879)
        at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:197)
        at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:227)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:136)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)

Basically, I am trying to use a hierarchical query in Spark.

Is it not supported?

Spark version: 2.3.0

Tags: java, apache-spark, apache-spark-sql

Solution


Spark does not currently support hierarchical queries, nor does it support recursion in queries (recursive CTEs). So to answer the question: yes, in the most limited sense, it is unsupported.
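For reference, the ANSI-SQL equivalent of Oracle's START WITH ... CONNECT BY is a recursive common table expression. The sketch below is purely illustrative of what "recursion in queries" would mean here (the CTE name tree is made up); Spark 2.3's parser rejects this form as well:

// Recursive-CTE form of the same traversal. Spark 2.3 does not
// support WITH RECURSIVE, so this call also fails to parse; it is
// shown only to illustrate the missing feature.
Dataset<Row> treeDF = sparkSession.sql(
        "with recursive tree (per_id1, per_id2) as (" +
        "  select per_id1, per_id2 from CI_PER_PER where per_id1 = '2001822000'" +
        "  union all" +
        "  select c.per_id1, c.per_id2 from CI_PER_PER c" +
        "  join tree t on c.per_id2 = t.per_id1" +
        ") select per_id1, per_id2 from tree");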

You can approximate it, but it is laborious. Here is one approach, though I don't really recommend it: http://sqlandhadoop.com/how-to-implement-recursive-queries-in-spark/
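The usual workaround is to unroll the recursion into a driver-side loop of self-joins, one join per level of the hierarchy. Below is a minimal sketch under stated assumptions: the hierarchy is shallow enough for a driver loop, the class name HierarchicalQueryApprox and the JDBC URL placeholder are made up, and the table and column names follow the question:

import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;
import static org.apache.spark.sql.functions.col;

public class HierarchicalQueryApprox {
    public static void main(String[] args) {
        SparkSession spark = SparkSession.builder()
                .appName("HierarchicalQueryApprox")
                .getOrCreate();

        // Load the edge table once and cache it; it is joined repeatedly.
        Dataset<Row> perPer = spark.read().format("jdbc")
                .option("url", "jdbc:oracle:thin:@...")  // placeholder URL
                .option("dbtable", "CI_PER_PER")
                .load()
                .select("per_id1", "per_id2")
                .cache();

        // Seed level: the START WITH condition.
        Dataset<Row> frontier = perPer.filter("per_id1 = '2001822000'");
        Dataset<Row> result = frontier;

        // Each pass applies CONNECT BY PRIOR per_id1 = per_id2:
        // keep child rows whose per_id2 matches a parent's per_id1.
        while (frontier.count() > 0) {
            Dataset<Row> next = perPer.alias("c")
                    .join(frontier.alias("p"),
                            col("c.per_id2").equalTo(col("p.per_id1")))
                    .select(col("c.per_id1"), col("c.per_id2"))
                    .except(result);  // drop rows already seen; also guards against cycles
            result = result.union(next);
            frontier = next;
        }

        result.show(10, false);
        spark.stop();
    }
}

Note that each iteration triggers a Spark job via count(), and the logical plan grows with every union, so for deep hierarchies you would want to checkpoint intermediate results.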

