Loading objects from Google Drive takes a lot more RAM than creating the same objects at runtime in Colab

Problem description

I am using Colab Pro+ to create a population of GANs. I save this population using torch.save(). While the GANs exist in the runtime environment, they do not take up very much RAM, but when I reload them from Google Drive during a later session, they explode my RAM, taking up like 40x the memory they did before.

Does anyone have an idea why? Thanks!

Here is the code I use to save and load the GANs:


import torch


def saveEntirePopulation(keyPath, population):
    # Pickle each whole GAN object to its own checkpoint file.
    for ind, gan in enumerate(population):
        torch.save({'gan': gan}, keyPath + 'population_' + str(ind))


def saveOneGan(keyPath, gan, index):
    torch.save({'gan': gan}, keyPath + 'population_' + str(index))


def loadPopulation(keyPath, popSize):
    popArr = []
    for ind in range(popSize):
        # Unpickle the whole GAN object back from the checkpoint.
        checkpoint = torch.load(keyPath + 'population_' + str(ind))
        gan = checkpoint['gan']
        gan.generator.train()
        gan.discriminator.train()
        popArr.append(gan)
    return popArr

Tags: pytorch, google-colaboratory, ram

Solution
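One plausible cause of this symptom (a sketch, not a confirmed answer): `torch.save` on a whole model object pickles the entire Python object graph, which can include optimizer state, cached buffers, and tensors that were resident on the GPU, and `torch.load` without a `map_location` will try to restore tensors to their original device. The approach recommended in PyTorch's serialization documentation is to save only `state_dict()` weights and load them with `map_location='cpu'`, rebuilding the module objects in code. A minimal sketch, assuming each GAN exposes `generator` and `discriminator` modules (the `nn.Linear` stand-ins here are hypothetical, just to keep the example self-contained; an in-memory buffer replaces the Drive path):

```python
import io
import torch
import torch.nn as nn

# Hypothetical stand-ins for one GAN's two networks.
generator = nn.Linear(4, 4)
discriminator = nn.Linear(4, 1)

# Save only the state_dicts (plain tensors), not pickled module objects.
buffer = io.BytesIO()  # stands in for keyPath + 'population_0'
torch.save({
    'generator': generator.state_dict(),
    'discriminator': discriminator.state_dict(),
}, buffer)
buffer.seek(0)

# Rebuild the architectures in code, then load the weights into them.
# map_location='cpu' keeps tensors on the CPU while deserializing.
checkpoint = torch.load(buffer, map_location='cpu')
g2 = nn.Linear(4, 4)
d2 = nn.Linear(4, 1)
g2.load_state_dict(checkpoint['generator'])
d2.load_state_dict(checkpoint['discriminator'])
g2.train()
d2.train()
```

With this layout the checkpoint holds only the weight tensors, so reloading a population should cost roughly the size of the weights rather than the full pickled object graph.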

