Scala Spark: aggregate a set of columns in a dataframe into a JSON string

Problem description

Given a dataframe,

+---+------+--------+---------+
| id|  name| payable| strategy|
+---+------+--------+---------+
|  0|   Joe|     100|     st-1|
|  1|   Tom|     200|     st-2|
|  2|  John|     300|     st-1|
+---+------+--------+---------+

what is the most efficient way to convert each row into a JSON string like the following?

{
  "payload": {
     "name": "Joe",
     "payments": [
         {
            "strategy": "st-1",
            "payable": 100
         }
     ]
  }
}

Currently I have a UDF that manually stringifies the given columns, but I would like to know whether there is a better way to achieve this. The to_json function is the best alternative I have found so far, but it only accepts a single column as input.
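For context, the current approach looks roughly like the sketch below (a simplified, hypothetical version of the UDF; real code would need proper JSON escaping, and it assumes the dataframe above is bound to df with spark.implicits._ imported):

import org.apache.spark.sql.functions.udf

// Simplified sketch of the manual stringification UDF (no escaping, for illustration only)
val toPayloadJson = udf { (name: String, payable: Int, strategy: String) =>
  s"""{"payload":{"name":"$name","payments":[{"strategy":"$strategy","payable":$payable}]}}"""
}

val jsonDf = df.withColumn("jsonValue", toPayloadJson($"name", $"payable", $"strategy"))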

Tags: scala, apache-spark

Solution


Using to_json() is the right approach, but the columns need to be wrapped in struct(s) as appropriate before being passed to it:

import org.apache.spark.sql.functions.{to_json, struct, array}
import spark.implicits._   // assumes a SparkSession named spark; enables the $"col" syntax

val df = Seq((0,"Joe",100,"st-1"), (1,"Tom",200,"st-2")).toDF("id","name","payable","strategy")

val result = df.select(
  to_json(struct(                                           // outer struct becomes the JSON root
    struct($"name",                                         // "payload" object: name + payments
      array(struct($"strategy",$"payable")) as "payments"   // single-element payments array
    ) as "payload")
  ) as "jsonValue"
)

result.show(false)
+-------------------------------------------------------------------------+
|jsonValue                                                                |
+-------------------------------------------------------------------------+
|{"payload":{"name":"Joe","payments":[{"strategy":"st-1","payable":100}]}}|
|{"payload":{"name":"Tom","payments":[{"strategy":"st-2","payable":200}]}}|
+-------------------------------------------------------------------------+
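If a single name can have several payable/strategy rows that should all land in the same payments array (the sample data has only one row per name, so this is an assumption about the intent), the same struct nesting combines naturally with groupBy and collect_list; a minimal sketch:

import org.apache.spark.sql.functions.{to_json, struct, collect_list}

// Collect all (strategy, payable) pairs per name into one payments array, then build the same JSON shape
val groupedResult = df
  .groupBy($"name")
  .agg(collect_list(struct($"strategy", $"payable")) as "payments")
  .select(to_json(struct(struct($"name", $"payments") as "payload")) as "jsonValue")

groupedResult.show(false)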
