Operations with dates in PySpark

Problem Description

For each delivery date, I want to check whether there is another delivery or a call within the next 7 days.

This is what I have:

+------+----------+----------+----------+------+
|id    |delivery  |call      |n_delivery|n_call|
+------+----------+----------+----------+------+
|a     |2018-10-19|null      |1         |0     |
|a     |2018-10-31|null      |1         |0     |
|a     |null      |2018-10-29|0         |1     |
|a     |2018-10-31|null      |1         |0     |
|a     |null      |2018-10-30|0         |1     |
|a     |2018-10-12|null      |1         |0     |
+------+----------+----------+----------+------+

This is what I want:

+------+----------+----------+----------+------+------+
|id    |delivery  |call      |n_delivery|n_call|target|
+------+----------+----------+----------+------+------+
|a     |2018-10-19|null      |1         |0     |0     |
|a     |2018-10-31|null      |1         |0     |0     |
|a     |null      |2018-10-29|0         |1     |0     |
|a     |2018-10-31|null      |1         |0     |0     |
|a     |null      |2018-10-30|0         |1     |0     |
|a     |2018-10-12|null      |1         |0     |1     |
+------+----------+----------+----------+------+------+

I am using window functions, but I don't really know how to use them for this.

from pyspark.sql import Window
from pyspark.sql import functions as f
from pyspark.sql.functions import col

days = lambda i: i * 86400

w1 = Window().partitionBy("id").orderBy(col('delivery').cast("timestamp").cast("long")).rangeBetween(0, days(7))

w2 = Window().partitionBy("id").orderBy(col('call').cast("timestamp").cast("long")).rangeBetween(0, days(7))

I tried counting n_delivery and n_call over these windows and then building the target from the new columns afterwards, but the result is incorrect.

dt1.select(col("*"), f.count('n_delivery').over(w1).alias('n_range_del'), f.count('n_call').over(w2).alias('n_range_call'))

Can anyone help me? Thanks!

Tags: python, apache-spark, pyspark

Solution

It is possible to do this with rangeBetween, but it may be less straightforward than using a simpler WindowSpec and creating a few intermediate columns.
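
For reference, here is a minimal sketch of what the rangeBetween route could look like. It assumes the coalesced delivery_or_call column that is built further down; the seconds-based frame (1, 7 * 86400) over the date cast to a long counts events strictly after the current row and up to 7 days out:

from pyspark.sql import functions as F, Window

days = lambda i: i * 86400  # rangeBetween operates on the long (seconds) value

w = (Window.partitionBy('id')
           .orderBy(F.col('delivery_or_call').cast('timestamp').cast('long'))
           .rangeBetween(1, days(7)))  # strictly later than this row, up to 7 days

# target = 1 for delivery rows that have any event in the 7 days that follow
df_range = df.withColumn(
    'target',
    F.when((F.col('n_delivery') > 0) & (F.count('delivery_or_call').over(w) > 0), 1)
     .otherwise(0))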

Here is the solution I came up with, which seems to work:

"""
+------+----------+----------+----------+------+
|id    |delivery  |call      |n_delivery|n_call|
+------+----------+----------+----------+------+
|a     |2018-10-19|null      |1         |0     |
|a     |2018-10-31|null      |1         |0     |
|a     |null      |2018-10-29|0         |1     |
|a     |2018-10-31|null      |1         |0     |
|a     |null      |2018-10-30|0         |1     |
|a     |2018-10-12|null      |1         |0     |
+------+----------+----------+----------+------+
"""
# Create a DataFrame with the example data
import pandas as pd
from pyspark.sql import functions as F, types as T, Window

data = [[1, 2, 3, 4, 5, 6],
        ['a', 'a', 'a', 'a', 'a', 'a'],
        ['2018-10-19', '2018-10-31', '', '2018-10-31', '', '2018-10-12'],
        ['', '', '2018-10-29', '', '2018-10-30', ''],
        [1, 1, 0, 1, 0, 1],
        [0, 0, 1, 0, 1, 0]]
cols = ['row_num', 'id', 'delivery', 'call', 'n_delivery', 'n_call']
df_pd = pd.DataFrame(data).T
df_pd.columns = cols
df = spark.createDataFrame(df_pd)

# Convert the date columns to DateType (empty strings parse to null)
df = df.withColumn('delivery', F.to_timestamp(F.col('delivery'), 'yyyy-MM-dd').cast(T.DateType()))
df = df.withColumn('call', F.to_timestamp(F.col('call'), 'yyyy-MM-dd').cast(T.DateType()))

# Get a coalesced column of delivery | call.
# This logic works as long as each row has *either* a call xor a delivery date,
# or has both and they are the same.
df = df.withColumn('delivery_or_call', F.coalesce(df['delivery'], df['call']))

# Window function to get the *next* delivery or call date for every delivery row
w_delivery_or_call = Window.partitionBy('id').orderBy(F.col('delivery_or_call').asc())
df = df.withColumn(
    'next_delivery_or_call',
    F.when(F.col('n_delivery') > 0,
           F.lead(F.col('delivery_or_call'), 1).over(w_delivery_or_call))
     .otherwise(None))

# Calc target: 1 if the next event falls 1-7 days after this row's delivery
df = df.withColumn(
    'target',
    F.when((F.datediff(F.col('next_delivery_or_call'), F.col('delivery')) > 0) &
           (F.datediff(F.col('next_delivery_or_call'), F.col('delivery')) <= 7), 1)
     .otherwise(0))

df.orderBy('delivery_or_call').show()

This produces the desired target:

+-------+---+----------+----------+----------+------+----------------+---------------------+------+
|row_num| id|  delivery|      call|n_delivery|n_call|delivery_or_call|next_delivery_or_call|target|
+-------+---+----------+----------+----------+------+----------------+---------------------+------+
|      6|  a|2018-10-12|      null|         1|     0|      2018-10-12|           2018-10-19|     1|
|      1|  a|2018-10-19|      null|         1|     0|      2018-10-19|           2018-10-29|     0|
|      3|  a|      null|2018-10-29|         0|     1|      2018-10-29|                 null|     0|
|      5|  a|      null|2018-10-30|         0|     1|      2018-10-30|                 null|     0|
|      4|  a|2018-10-31|      null|         1|     0|      2018-10-31|           2018-10-31|     0|
|      2|  a|2018-10-31|      null|         1|     0|      2018-10-31|                 null|     0|
+-------+---+----------+----------+----------+------+----------------+---------------------+------+
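
If you only need the original columns plus the flag, the intermediate helper columns can be dropped afterwards, for example:

# Drop the intermediate helper columns, keeping only the target flag
df_final = df.drop('delivery_or_call', 'next_delivery_or_call')
df_final.orderBy('row_num').show()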
