Collecting DataFrame Column Data into JSON

Problem description

I have a DataFrame with two columns serving as the "key": `id1` and `id2`.

val df1 = Seq(
  (1, 11, "n1", "d1"),
  (1, 22, "n2", "d2"),
  (2, 11, "n3", "d3"),
  (2, 11, "n4", "d4")
).toDF("id1", "id2", "number", "data")

scala> df1.show
+---+---+------+----+
|id1|id2|number|data|
+---+---+------+----+
|  1| 11|    n1|  d1|
|  1| 22|    n2|  d2|
|  2| 11|    n3|  d3|
|  2| 11|    n4|  d4|
+---+---+------+----+

I want to get JSON grouped by the DataFrame's key columns, like this:

+---+---+------------------------------------------------------------------+
|id1|id2|json                                                              |
+---+---+------------------------------------------------------------------+
|  1| 11|[{"number" : "n1", "data": "d1"}]                                 |
|  1| 22|[{"number" : "n2", "data": "d2"}]                                 |
|  2| 11|[{"number" : "n3", "data": "d3"}, {"number" : "n4", "data": "d4"}]|
+---+---+------------------------------------------------------------------+

Versions:

Spark: 2.2
Scala: 2.11

Tags: json, scala, apache-spark, apache-spark-sql

Solution


This can be done by first using `to_json` to convert the `number` and `data` columns into JSON strings, then using `groupBy` on the two id columns together with `collect_list` to get the desired result.

import org.apache.spark.sql.functions.{collect_list, struct, to_json}

val df2 = df1.withColumn("json", to_json(struct($"number", $"data")))
  .groupBy("id1", "id2")
  .agg(collect_list($"json").as("json"))
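Note that `collect_list` produces an array of JSON strings per key, not the single bracketed string shown in the question. As a plain-Scala sketch of the same transformation (no Spark session required; the tuple layout mirrors `df1` above, and the bracketed rendering is done with `mkString`):

```scala
// Group (number, data) pairs by the (id1, id2) key and render each group
// as one bracketed JSON-array string, mimicking groupBy + collect_list
// followed by joining the collected strings.
val rows = Seq(
  (1, 11, "n1", "d1"),
  (1, 22, "n2", "d2"),
  (2, 11, "n3", "d3"),
  (2, 11, "n4", "d4")
)

val grouped = rows
  .groupBy { case (id1, id2, _, _) => (id1, id2) }
  .map { case ((id1, id2), rs) =>
    val json = rs
      .map { case (_, _, number, data) =>
        s"""{"number": "$number", "data": "$data"}"""
      }
      .mkString("[", ", ", "]")
    (id1, id2, json)
  }
```

In Spark itself, the equivalent single-string result could be built by joining the collected list (for example with `concat_ws` around `collect_list`), but the array-of-strings column from the answer above is usually sufficient.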
