node.js - Upload large files properly using AWS lambda and S3 (with existing limits)
Problem description
The current AWS Lambda limit on POST/PUT request payload size is 6 MB, and the minimum part size for an S3 multipart upload is 5 MB.
A 5 MB file, once encoded in an upload request, actually occupies more than 6 MB, so the straightforward approach of uploading 5 MB chunks through a Lambda function doesn't work.
How do I properly implement uploading of large files (more than 5 MB)?
After some struggling I came up with a workaround: use Object Expiration rules to treat a bucket prefix as temporary storage, upload small sub-chunks there first, and then upload a normal chunk together with the sub-chunk IDs, so the Lambda function can fetch the sub-chunks from the temporary files and concatenate them with the uploaded chunk into a single part of at least 5 MB. This solution works, but I still have to wait for each sub-chunk to finish uploading, so a fully parallel multipart upload is not possible this way (we still have to wait for the sub-chunks to be uploaded).
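The Lambda-side buffering this implies can be sketched as follows. This is a local simulation only: `PartBuffer` and its method names are illustrative, not an AWS API, and the `flush` step stands in for a real `s3.upload_part` call with boto3.

```python
# Sketch of the concatenation logic described above: sub-chunks (each small
# enough to fit within Lambda's ~6 MB request limit) are accumulated until
# they reach S3's 5 MB minimum part size, then flushed as one multipart part.
# In a real Lambda, flush() would call s3.upload_part(...).

MIN_PART_SIZE = 5 * 1024 * 1024  # S3 multipart minimum part size (all parts except the last)


class PartBuffer:
    def __init__(self):
        self.buffer = bytearray()
        self.parts = []  # sizes of the parts "uploaded" so far

    def add_sub_chunk(self, data: bytes):
        # Concatenate the incoming sub-chunk with what is already buffered.
        self.buffer.extend(data)
        if len(self.buffer) >= MIN_PART_SIZE:
            self.flush()

    def flush(self):
        # Stand-in for s3.upload_part: record the part size and reset.
        self.parts.append(len(self.buffer))
        self.buffer = bytearray()

    def finish(self):
        # The final part of a multipart upload may be smaller than 5 MB.
        if self.buffer:
            self.flush()


# Four 3 MB sub-chunks produce two 6 MB parts, each above the 5 MB minimum.
buf = PartBuffer()
for _ in range(4):
    buf.add_sub_chunk(b"x" * (3 * 1024 * 1024))
buf.finish()
print([p // (1024 * 1024) for p in buf.parts])  # → [6, 6]
```

The point of the buffer is that no individual request to Lambda ever needs to carry 5 MB; only the concatenated result handed to S3 does.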
Are there any better ideas for working within these S3 and AWS Lambda limits?
Solution