Spark DataFrames: CASE statement when using the Window PARTITION BY syntax

Problem description

I need to check a condition: if ReasonCode is "YES", then use ProcessDate as one of the PARTITION columns; otherwise, do not.

The equivalent SQL query looks like this:

SELECT PNum,
       SUM(SIAmt) OVER (PARTITION BY PNum,
                                     ReasonCode,
                                     CASE WHEN ReasonCode = 'YES' THEN ProcessDate ELSE NULL END
                        ORDER BY ProcessDate
                        RANGE BETWEEN UNBOUNDED PRECEDING AND CURRENT ROW) SumAmt
FROM TABLE1
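For reference, a minimal sketch of running this exact query through spark.sql (assuming the DataFrame is registered as a temp view named TABLE1, a name taken from the query above; the SELECT list is widened to all columns so the result matches the expected output below, and 'Yes' is used in the comparison because Spark's string equality is case-sensitive and the sample data uses "Yes"):

inputDF.createOrReplaceTempView("TABLE1")
val sqlDF = spark.sql(
  """SELECT PNum, ReasonCode, ProcessDate, SIAmt,
    |       SUM(SIAmt) OVER (PARTITION BY PNum,
    |                                     ReasonCode,
    |                                     CASE WHEN ReasonCode = 'Yes' THEN ProcessDate ELSE NULL END
    |                        ORDER BY ProcessDate
    |                        RANGE BETWEEN UNBOUNDED PRECEDING AND CURRENT ROW) AS SumAmt
    |FROM TABLE1""".stripMargin)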

So far I have tried the following query, but I have not been able to incorporate the condition

"CASE WHEN ReasonCode = 'YES' THEN ProcessDate ELSE NULL END"

into the Spark DataFrames version:

val df = inputDF
  .withColumn("SumAmt", sum("SIAmt").over(
    Window.partitionBy("PNum", "ReasonCode").orderBy("ProcessDate")))

Input data:

---------------------------------------
Pnum    ReasonCode  ProcessDate SIAmt
---------------------------------------
1       No          1/01/2016   200
1       No          2/01/2016   300
1       Yes         3/01/2016   -200
1       Yes         4/01/2016   200
---------------------------------------
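To reproduce the problem, here is a minimal sketch of building this input as a DataFrame (column names follow the code above; ProcessDate is kept as a string, as shown in the table; assumes a SparkSession named spark):

import spark.implicits._

// Sample rows copied from the input table above
val inputDF = Seq(
  (1, "No",  "1/01/2016",  200),
  (1, "No",  "2/01/2016",  300),
  (1, "Yes", "3/01/2016", -200),
  (1, "Yes", "4/01/2016",  200)
).toDF("PNum", "ReasonCode", "ProcessDate", "SIAmt")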

Expected output:

----------------------------------------------
Pnum    ReasonCode  ProcessDate  SIAmt  SumAmt
----------------------------------------------
1       No          1/01/2016     200     200
1       No          2/01/2016     300     500
1       Yes         3/01/2016    -200    -200
1       Yes         4/01/2016     200     200
----------------------------------------------

Any suggestions/help on doing this with Spark DataFrames rather than a spark-sql query?

Tags: scala, apache-spark, apache-spark-sql

Solution


You can express exactly the same thing as the SQL in DataFrame API form:

import org.apache.spark.sql.functions._
import org.apache.spark.sql.expressions._

val df = inputDF
  .withColumn("SumAmt", sum("SIAmt").over(
    Window
      .partitionBy(
        col("PNum"),
        col("ReasonCode"),
        when(col("ReasonCode") === "Yes", col("ProcessDate")).otherwise(null))
      .orderBy("ProcessDate")))

You can also add a .rowsBetween(Long.MinValue, 0) part to make the window frame explicit, as in the sketch below.
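A sketch of that variant (Long.MinValue, 0 corresponds to RANGE BETWEEN UNBOUNDED PRECEDING AND CURRENT ROW in the original SQL):

val dfExplicit = inputDF
  .withColumn("SumAmt", sum("SIAmt").over(
    Window
      .partitionBy(
        col("PNum"),
        col("ReasonCode"),
        when(col("ReasonCode") === "Yes", col("ProcessDate")).otherwise(null))
      .orderBy("ProcessDate")
      .rowsBetween(Long.MinValue, 0)))

Either way, this should give you: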

+----+----------+-----------+-----+------+
|Pnum|ReasonCode|ProcessDate|SIAmt|SumAmt|
+----+----------+-----------+-----+------+
|   1|       Yes|  4/01/2016|  200|   200|
|   1|        No|  1/01/2016|  200|   200|
|   1|        No|  2/01/2016|  300|   500|
|   1|       Yes|  3/01/2016| -200|  -200|
+----+----------+-----------+-----+------+
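The CASE key is NULL for every "No" row, so those rows share one partition and the running sum accumulates (200, then 500); for "Yes" rows the key is the row's own ProcessDate, so each "Yes" row forms its own single-row partition and the sum restarts. If you prefer, the same expression can be materialized as a temporary column first — a sketch, using a hypothetical helper column named PartKey (a when without otherwise yields NULL for non-matching rows):

val dfViaHelper = inputDF
  // NULL for "No" rows, the row's own ProcessDate for "Yes" rows
  .withColumn("PartKey", when(col("ReasonCode") === "Yes", col("ProcessDate")))
  .withColumn("SumAmt", sum("SIAmt").over(
    Window.partitionBy("PNum", "ReasonCode", "PartKey").orderBy("ProcessDate")))
  .drop("PartKey")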
