Writing multiple files to an HTTP response using streams in Node.js

Problem description

I have a set of files that I have to pack into a gzip archive and send on the fly in an HTTP response. That means I cannot hold the whole archive in memory; I have to pipe the files into tar entries one at a time, otherwise everything breaks.

const fs = require('fs');                //for file streams and stat
const tar = require('tar-stream');       //lib for tar stream
const { createGzip } = require('zlib');  //lib for gzip stream

//large list of huge files.
const files = [ 'file1', 'file2', 'file3', ..., 'file99999' ];
...

//http request handler:
const pack = tar.pack(); //tar stream, creates .tar
const gzipStream = createGzip(); //gzip stream so we can reduce the size

//pipe archive data through the gzip stream
//and send it to the client on the fly
pack.pipe(gzipStream).pipe(response);

//The issue comes here, when I need to pass multiple files to pack.entry
files.forEach(name => {
    const src = fs.createReadStream(name);    //create a stream from the file
    const size = fs.statSync(name).size;      //determine its size
    const entry = pack.entry({ name, size }); //create the tar entry

    //and this ruins everything, because if two different streams
    //write into the entry at the same time, it fails and throws an error
    src.pipe(entry);
});

Basically, I need each pipe to finish sending its data before the next one starts (something like await src.pipe(entry);), but pipe in Node.js doesn't work that way. Is there any way around this?
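For what it's worth, modern Node.js does ship an awaitable form of pipe in the built-in stream/promises module. A minimal sketch, assuming Node.js 15 or later; the writeEntry name is made up for illustration:

const { pipeline } = require('stream/promises');

//an awaitable equivalent of src.pipe(entry): the returned promise resolves
//once src has been fully written into entry, and rejects if either stream errors
const writeEntry = (src, entry) => pipeline(src, entry);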

Tags: node.js, stream, httpresponse, large-files

Solution


Never mind, the fix was simply not to use forEach in this case.
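In other words, replace the forEach with a sequential loop so the next entry only starts once the previous pipe has finished. A minimal sketch of that idea, assuming Node.js 15+ for stream/promises; the packFiles helper and the error handling are illustrative assumptions, not part of the original answer:

const fs = require('fs');
const { pipeline } = require('stream/promises');

//pack the files one at a time: awaiting each pipeline guarantees that
//only a single source stream writes into the tar stream at any moment
async function packFiles(pack, files) {
    for (const name of files) {
        const size = (await fs.promises.stat(name)).size; //determine its size
        const entry = pack.entry({ name, size });         //create the tar entry
        await pipeline(fs.createReadStream(name), entry); //wait until this file is fully written
    }
    pack.finalize(); //no more entries: close the .tar stream
}

packFiles(pack, files).catch(err => response.destroy(err));

Because each pipeline call is awaited before the next file is opened, the concurrent-writes error from the question cannot occur, and memory use stays bounded by the stream buffers rather than the file sizes.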

