How to insert/copy data from one partition into multiple partitions in Hive?

Problem description

I have data in my Hive table for day='2019-01-01' and I want to copy the same data to every day of January 2019 (i.e. '2019-01-02', '2019-01-03', ..., '2019-01-31').

I am trying the following, but the data only gets inserted into '2019-01-02' and not '2019-01-03':

INSERT OVERWRITE TABLE db_t.students PARTITION(dt='2019-01-02', dt='2019-01-03')
SELECT id, name, marks FROM db_t.students WHERE dt='2019-01-01';

Tags: hive, calendar, hiveql, date-range, hive-partitions

Solution


Cross join all the data with a calendar of dates covering the required range, and use dynamic partitioning:

set hivevar:start_date=2019-01-02; 
set hivevar:end_date=2019-01-31; 

set hive.exec.dynamic.partition=true; 
set hive.exec.dynamic.partition.mode=nonstrict;  

with date_range as 
(--this subquery generates the date range
select date_add('${hivevar:start_date}', s.i) as dt 
  from ( select posexplode(split(space(datediff('${hivevar:end_date}','${hivevar:start_date}')),' ')) as (i,x) ) s
)

INSERT OVERWRITE TABLE db_t.students PARTITION(dt)
SELECT id, name, marks, r.dt --partition column is the last one
  FROM db_t.students s 
       CROSS JOIN date_range r
 WHERE s.dt='2019-01-01'
DISTRIBUTE BY r.dt;
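
If the target date range is large, Hive's per-query dynamic-partition limits may also come into play. The defaults are normally enough for a single month; the settings below show how to raise them (the values are only illustrative):

set hive.exec.max.dynamic.partitions=1000;         -- max dynamic partitions per statement
set hive.exec.max.dynamic.partitions.pernode=500;  -- max dynamic partitions per mapper/reducer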

Another possible solution is to copy the partition data with hadoop fs -cp or hadoop distcp (repeat the command for each partition, or use a loop in the shell as sketched below):

hadoop fs -cp '/usr/warehouse/students/dt=2019-01-01' '/usr/warehouse/students/dt=2019-01-02'
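
A rough sketch of that shell loop, assuming GNU date is available and that /usr/warehouse/students is the table's warehouse path; after copying the files, the new directories still have to be registered in the metastore (for example with MSCK REPAIR TABLE):

#!/bin/bash
# copy the dt=2019-01-01 partition directory to dt=2019-01-02 ... dt=2019-01-31
src='/usr/warehouse/students/dt=2019-01-01'
for i in $(seq 1 30); do
    d=$(date -d "2019-01-01 +${i} days" +%F)    # 2019-01-02, 2019-01-03, ...
    hadoop fs -cp "$src" "/usr/warehouse/students/dt=${d}"
done
# make the new partition directories visible to Hive
hive -e "MSCK REPAIR TABLE db_t.students;"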

There is also a solution using UNION ALL:

    set hive.exec.dynamic.partition=true; 
    set hive.exec.dynamic.partition.mode=nonstrict;      

    INSERT OVERWRITE TABLE db_t.students PARTITION(dt)
    SELECT id, name, marks, '2019-01-02' as dt FROM db_t.students s WHERE s.dt='2019-01-01'
    UNION ALL
     SELECT id, name, marks, '2019-01-03' as dt FROM db_t.students s WHERE s.dt='2019-01-01'
    UNION ALL
     SELECT id, name, marks, '2019-01-04' as dt FROM db_t.students s WHERE s.dt='2019-01-01' 
    UNION ALL
    ... 
  ;
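
Whichever approach you use, a quick sanity check afterwards is to count the rows per partition, for example:

-- verify that every target partition received the copied rows
SELECT dt, count(*) AS cnt
  FROM db_t.students
 WHERE dt BETWEEN '2019-01-01' AND '2019-01-31'
 GROUP BY dt;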
