r - Is it dangerous to increase the memory limit in R?
Problem Description
Suppose you have a very large dataset, and when running your code you receive the error message
Error: cannot allocate vector of size 15.5 Gb in R
As I understand it, this means that the memory available to R is much smaller than the size of the matrix. I read that one solution is to use high-performance servers. A second one is to increase the memory limit that R uses.
In this post, R memory management / cannot allocate vector of size n Mb, they discuss the solution of increasing R's memory limit.
However, in the comment section of the third answer, they say that increasing the R memory limit is a dangerous approach. But why is that? We simply increase the memory that R uses; what can go wrong with that?
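The limit-raising approach the question refers to can be sketched as follows. Note that `memory.limit()` is Windows-only and has been deprecated (and later made defunct) since R 4.2, where R simply uses all available system memory; the size value below is illustrative, not a recommendation:

```r
# Windows-only, pre-R-4.2: query and raise R's memory ceiling.
memory.limit()              # current limit in MB
memory.limit(size = 16000)  # illustrative: raise the limit to ~16 GB

# A platform-independent sanity check before allocating: inspect how
# much memory an object of the intended shape would actually need.
print(object.size(numeric(1e6)), units = "MB")
```

The danger the comments allude to is that the limit only tells R how much it may *ask* the OS for: setting it above physical RAM means allocations succeed but force the OS to swap to disk, which can slow the machine to a crawl or let the process be killed by the OS, rather than failing early with a clean R error.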
Solution