Spring Batch FlatFileItemWriter write Object to csv

Problem description

I am using Spring Batch and have an ItemWriter as follows:

public class MyItemWriter implements ItemWriter<Fixing> {

    private final FlatFileItemWriter<Fixing> writer;
    private final FileSystemResource resource;

    public MyItemWriter () {
        this.writer = new FlatFileItemWriter<>();
        this.resource = new FileSystemResource("target/output-teste.txt");
    }


    @Override
    public void write(List<? extends Fixing> items) throws Exception {

        this.writer.setResource(new FileSystemResource(resource.getFile()));
        this.writer.setLineAggregator(new PassThroughLineAggregator<>());
        this.writer.afterPropertiesSet();
        this.writer.open(new ExecutionContext());
        this.writer.write(items);
    }

    @AfterWrite
    private void close() {
        this.writer.close();
    }
}

When I run my Spring Batch job, the following items are written to the file:

Fixing{id='123456', source='TEST', startDate=null, endDate=null}
Fixing{id='1234567', source='TEST', startDate=null, endDate=null}
Fixing{id='1234568', source='TEST', startDate=null, endDate=null}

1 / How can I write only the data, with values separated by commas and null values omitted? The target file should look like this:

123456,TEST
1234567,TEST
1234568,TEST

2 / Secondly, I ran into a problem where I can only see the file being created once I exit the Spring Boot application. What I want is for the file to be usable as soon as all items have been processed and written, without shutting down the Spring Boot application.

Tags: java spring spring-boot spring-batch

Solution


There are multiple options for writing to a csv file. Regarding the second question, flushing the writer will solve the problem.

  1. https://howtodoinjava.com/spring-batch/flatfileitemwriter-write-to-csv-file/
  2. We prefer using OpenCSV with Spring Batch because it is faster and gives us more control over large files. An example snippet is shown below:

    import java.io.BufferedWriter;
    import java.io.Closeable;
    import java.io.File;
    import java.io.IOException;
    import java.nio.charset.StandardCharsets;
    import java.nio.file.Files;
    import java.nio.file.Paths;
    import java.nio.file.StandardOpenOption;
    import java.util.ArrayList;
    import java.util.List;

    import javax.annotation.PreDestroy;

    import com.opencsv.CSVWriter;
    import com.opencsv.bean.ColumnPositionMappingStrategy;
    import com.opencsv.bean.StatefulBeanToCsv;
    import com.opencsv.bean.StatefulBeanToCsvBuilder;

    import org.slf4j.Logger;
    import org.slf4j.LoggerFactory;
    import org.springframework.batch.item.ItemWriter;

    class DocumentWriter implements ItemWriter<BaseDTO>, Closeable {

        private static final Logger LOG = LoggerFactory.getLogger(DocumentWriter.class);

        private static final String[] columns = new String[] { "csvcolumn1", "csvcolumn2", "csvcolumn3",
                "csvcolumn4", "csvcolumn5", "csvcolumn6", "csvcolumn7" };

        private ColumnPositionMappingStrategy<Statement> strategy;
        private BufferedWriter writer;
        private StatefulBeanToCsv<Statement> beanToCsv;
        private String filename;

        public DocumentWriter() throws Exception {
            strategy = new ColumnPositionMappingStrategy<Statement>();
            strategy.setType(Statement.class);
            strategy.setColumnMapping(columns);

            // env and processCount are provided by the surrounding application context
            filename = env.getProperty("globys.statement.cdf.path") + "-" + processCount + ".dat";

            File cdf = new File(filename);
            if (cdf.exists()) {
                writer = Files.newBufferedWriter(Paths.get(filename), StandardCharsets.UTF_8,
                        StandardOpenOption.APPEND);
            } else {
                writer = Files.newBufferedWriter(Paths.get(filename), StandardCharsets.UTF_8,
                        StandardOpenOption.CREATE_NEW);
            }

            beanToCsv = new StatefulBeanToCsvBuilder<Statement>(writer)
                    .withQuotechar(CSVWriter.NO_QUOTE_CHARACTER)
                    .withMappingStrategy(strategy)
                    .withSeparator(',')
                    .build();
        }

        @Override
        public void write(List<? extends BaseDTO> items) throws Exception {
            List<Statement> settlementList = new ArrayList<Statement>();
            for (BaseDTO baseDTO : items) {
                settlementList.addAll(baseDTO.getStatementList());
            }
            beanToCsv.write(settlementList);
            // flush after each chunk so the file is usable without stopping the application
            writer.flush();
        }

        @PreDestroy
        @Override
        public void close() throws IOException {
            writer.close();
        }
    }
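For the first question, the comma-separated, null-skipping output can also be produced by a custom LineAggregator on the original FlatFileItemWriter. Below is a minimal sketch of just the joining logic in plain Java (no Spring Batch dependency); the Fixing getter names used afterwards are assumptions based on the printed toString():

```java
import java.util.ArrayList;
import java.util.List;

public class FixingLineAggregator {

    // Joins the non-null values with commas, so a record like
    // {id='123456', source='TEST', startDate=null, endDate=null} becomes "123456,TEST".
    public static String aggregate(Object... fieldValues) {
        List<String> parts = new ArrayList<>();
        for (Object value : fieldValues) {
            if (value != null) {
                parts.add(value.toString());
            }
        }
        return String.join(",", parts);
    }

    public static void main(String[] args) {
        System.out.println(aggregate("123456", "TEST", null, null)); // prints 123456,TEST
    }
}
```

It could then replace the PassThroughLineAggregator with something like `writer.setLineAggregator(item -> FixingLineAggregator.aggregate(item.getId(), item.getSource(), item.getStartDate(), item.getEndDate()));`, where the getter names are assumed, not taken from the question.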

