NodeJS Sharp node package memory consumption issue

Problem description

I'm working on a NodeJS project that uses the sharp node package to resize JPEG/JPG images.

The problem is that the Node process keeps adding the processed files to memory and never releases them.

As a result, the memory consumed grows with every request and is never freed.

After debugging the application, I realised that sharp's toBuffer API is what causes the problem.

I tried an alternative solution by creating a custom writable stream and piping the sharp Duplex stream into it, but it ends up hitting the same problem.

I can't tell whether I'm missing something here or whether this is a bug.

Sharing the code below (I've removed the unnecessary parts to keep it compact) -

const { Writable } = require("stream");
const { createServer } = require("http");
const { readFileSync } = require("fs");
const sharp = require("sharp");

async function resizeJpeg(input_buffer) {
    // initialise file, the response object
    const file = { normalisedImage: null, originalImage: null };
    // initialise sharp instance using original image buffer
    let image = sharp(input_buffer);
    // set the original image metadata
    file.originalImage = await image.metadata();
    file.originalImage.quality = 85;

    // generate buffer using sharp resize API with default quality & dimensions.
    // ############### THIS IS WHERE MEMORY CONSUMPTION OCCURS ###############

    // APPROACH 1 (SHARP toBuffer API)
    const buffer = await image.resize(2000, 798).jpeg({ quality: 85 }).toBuffer();
    // APPROACH 1 ENDS 

    // APPROACH 2 (CUSTOM WRITABLE STREAM)
    // const buffer = await sharpToBuffer(image.resize(2000, 798).jpeg({ quality: 85 }));
    // APPROACH 2 ENDS

    // set resized image metadata
    file.normalisedImage = await sharp(buffer).metadata();
    file.normalisedImage.quality = 85;
    return file;
}

// converts a sharp readable stream to a buffer using the custom writable stream below
async function sharpToBuffer(readable) {
    return new Promise((resolve, reject) => {
        const writable = new WriteStream().on("finish", () => resolve(Buffer.concat(writable.bufferChunks)));
        readable.on("error", reject).pipe(writable);
    });
}

// simple writable stream
class WriteStream extends Writable {
    constructor() { super(); this.bufferChunks = [] }
    _write(chunk, encoding, next) { this.bufferChunks.push(chunk); next(); }
}

createServer(async (request, response) => {
    // ignore favicon calls
    if (request.url.indexOf("favicon.ico") > -1) { return response.end(); }
    // trigger resize call and pass buffer of original image file
    const { normalisedImage, originalImage } = await resizeJpeg(readFileSync(`${__dirname}/30mb.jpg`));
    // respond stringified json
    response.end(
        JSON.stringify({
            normalisedImage: { size: normalisedImage.size, width: normalisedImage.width, height: normalisedImage.height, quality: normalisedImage.quality },
            originalImage: { size: originalImage.size, width: originalImage.width, height: originalImage.height, quality: originalImage.quality }
        }, null, 4));
}).listen(3000, () => console.log("server started"));

As you can see, both approaches are implemented in the resizeJpeg function.

To run this, you only need to make sure the 30mb.jpg file exists in the same directory.

The image I used can be found here.
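
To actually watch the memory grow per request, the endpoint can be hit in a loop; below is a minimal driver sketch (the port matches the server above, the request count of 50 is arbitrary) -

// hypothetical driver script: fires sequential requests at the resize server
const http = require("http");

function hit() {
    return new Promise((resolve) => {
        http.get("http://localhost:3000/", (res) => {
            res.resume();            // drain the body so the socket is released
            res.on("end", resolve);
        });
    });
}

(async () => {
    for (let i = 0; i < 50; i++) {   // arbitrary number of requests
        await hit();
        console.log(`request ${i + 1} done`);
    }
})();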

If you are on linux, below is the top command to monitor the memory, assuming the file is named so.js -

top $(ps -ef | grep 'so.js' | awk '{print $2}' | sed 's/.*/-p &/' | xargs echo) -c
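
For a rough in-process view, Node's built-in process.memoryUsage() can also be logged periodically; a minimal sketch that could be dropped into the server file (rss is the figure top reports, while external covers Buffer allocations like the ones sharp returns) -

// log process memory every 5 seconds (values converted from bytes to MB)
setInterval(() => {
    const m = process.memoryUsage();
    console.log(
        `rss=${(m.rss / 1048576).toFixed(1)}MB`,
        `heapUsed=${(m.heapUsed / 1048576).toFixed(1)}MB`,
        `external=${(m.external / 1048576).toFixed(1)}MB`
    );
}, 5000);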

Tags: node.js, image-processing, memory-leaks, sharp

Solution

