
Problem

Loading a pretrained Hugging Face Transformers model seemingly requires you to have the model saved locally (as described here), so that you simply pass a local path to your model and config:

model = PreTrainedModel.from_pretrained('path/to/model', local_files_only=True)

Is this possible while the model is stored on S3?

Tags: amazon-s3, huggingface-transformers

Solution


Answering my own question here... (this is apparently encouraged).

I got this working using a temporary file (NamedTemporaryFile), which does the trick. I was hoping to find an in-memory solution (i.e. passing a BytesIO directly into from_pretrained), but that would require a patch to the transformers codebase.

import boto3 
import json 

from contextlib import contextmanager 
from io import BytesIO 
from tempfile import NamedTemporaryFile 
from transformers import PretrainedConfig, PreTrainedModel 
  
@contextmanager 
def s3_fileobj(bucket, key): 
    """
    Yields a file object from the filename at {bucket}/{key}

    Args:
        bucket (str): Name of the S3 bucket where your model is stored.
        key (str): Relative path from the base of the bucket, including the
            filename and extension of the object to be retrieved.
    """
    s3 = boto3.client("s3") 
    obj = s3.get_object(Bucket=bucket, Key=key) 
    yield BytesIO(obj["Body"].read()) 
 
def load_model(bucket, path_to_model, model_name='pytorch_model'):
    """
    Load a model at the given S3 path. It is assumed that your model is stored at the key:

        '{path_to_model}/{model_name}.bin'

    and that a config has also been generated at the same path named:

        f'{path_to_model}/config.json'

    """
    tempfile = NamedTemporaryFile()
    with s3_fileobj(bucket, f'{path_to_model}/{model_name}.bin') as f:
        tempfile.write(f.read())
    tempfile.flush()  # ensure buffered bytes are on disk before from_pretrained reads the file
 
    with s3_fileobj(bucket, f'{path_to_model}/config.json') as f: 
        dict_data = json.load(f) 
        config = PretrainedConfig.from_dict(dict_data) 
 
    model = PreTrainedModel.from_pretrained(tempfile.name, config=config) 
    return model 
     
model = load_model('my_bucket', 'path/to/model')
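One detail worth highlighting: NamedTemporaryFile buffers writes, so the bytes must be flushed before anything else reads the file by its name, otherwise the model loader may see a truncated file. A minimal sketch of that behavior, with placeholder bytes standing in for the downloaded weights (no S3 access involved):

```python
from tempfile import NamedTemporaryFile

# Write some stand-in bytes to a named temporary file.
tmp = NamedTemporaryFile()
tmp.write(b"model bytes")
tmp.flush()  # without this, buffered data may not be on disk yet

# Reopen the file by name, as from_pretrained does internally.
with open(tmp.name, "rb") as f:
    data = f.read()
```

Note that reopening a NamedTemporaryFile by name while it is still open works on POSIX systems but not on Windows, where the file is held exclusively.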
