Keras Tuner: number of layers used does not match the number of layers reported

Problem description

Following the example on the Keras Tuner website, I wrote some simple tuning code:

import tensorflow as tf
from kerastuner.tuners import RandomSearch

base_model = tf.keras.applications.vgg16.VGG16(input_shape=IMG_SHAPE,
                                               include_top=False,
                                               weights='imagenet')
base_model.trainable = False

def build_model(hp):
    model = tf.keras.Sequential()
    model.add(base_model)

    # Add 1-2 Conv2D/Dropout blocks; the filter count of each block is tuned.
    for i in range(hp.Int('num_layers', 1, 2)):
        model.add(tf.keras.layers.Conv2D(filters=hp.Int('Conv2D_' + str(i),
                                                        min_value=32,
                                                        max_value=512,
                                                        step=32),
                                         kernel_size=3, activation='relu'))
        model.add(tf.keras.layers.Dropout(hp.Choice('rate', [0.3, 0.5])))

    model.add(tf.keras.layers.GlobalAveragePooling2D())
    model.add(tf.keras.layers.Flatten())
    model.add(tf.keras.layers.Dropout(0.2))
    model.add(tf.keras.layers.Dense(5, activation='softmax'))

    model.compile(optimizer=tf.keras.optimizers.RMSprop(hp.Choice('learning_rate', [1e-4, 1e-5])),
                  loss='categorical_crossentropy',
                  metrics=['accuracy'])

    return model


epochs = 2
callback = tf.keras.callbacks.EarlyStopping(monitor='val_loss', patience=3)

tuner = RandomSearch(
    build_model,
    objective='val_accuracy',
    max_trials=24,
    executions_per_trial=1,
    directory=LOG_DIR)

tuner.search_space_summary()

tuner.search(train_generator,
             callbacks=[callback],
             epochs=epochs,
             steps_per_epoch=train_generator.samples // BATCH_SIZE,
             validation_data=valid_generator,
             validation_steps=valid_generator.samples // BATCH_SIZE,
             verbose=1)

tuner.results_summary()
models = tuner.get_best_models(num_models=2)

However, when I run it and the trials use different numbers of layers, the number of Conv2D layers reported does not match the value of num_layers. For example, the trial summary below reports three Conv2D layers but shows num_layers as 1. Why?

[Trial summary]
 |-Trial ID: 79cd7bb6146b4c243eb2bc51f19985de
 |-Score: 0.8444444537162781
 |-Best step: 0
 > Hyperparameters:
 |-Conv2D_0: 448
 |-Conv2D_1: 448
 |-Conv2D_2: 512
 |-learning_rate: 0.0001
 |-num_layers: 1
 |-rate: 0.5

Tags: python, tensorflow, keras, deep-learning, keras-tuner

Solution

Any hyperparameter that has been seen so far is shown in the summary. This means that once a trial with three Conv2D layers has run, every subsequent summary will list the sizes of all three layers. It does not mean that all three layers were used in a given trial, as the num_layers: 1 printed for this particular trial indicates.

For details, see omalleyt12's post here: https://github.com/keras-team/keras-tuner/issues/66#issuecomment-525923517
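If you want to confirm that a trial's model really contains only num_layers tuned Conv2D blocks, you can compare the rebuilt model's layers against the trial's hyperparameter values. The snippet below is a minimal verification sketch, assuming the tuner object from the question; it uses keras-tuner's oracle.get_best_trials and counts only the top-level Conv2D layers added in build_model (the frozen VGG16 base appears as a single nested layer):

best_trial = tuner.oracle.get_best_trials(num_trials=1)[0]
hp_values = best_trial.hyperparameters.values
# May list Conv2D_0, Conv2D_1, Conv2D_2 even when num_layers < 3,
# because those hyperparameters were registered by earlier trials.
print(hp_values)

best_model = tuner.get_best_models(num_models=1)[0]
# Count only the Conv2D layers added in build_model; the VGG16 base
# shows up as one nested layer in best_model.layers.
tuned_convs = sum(isinstance(layer, tf.keras.layers.Conv2D)
                  for layer in best_model.layers)
print(tuned_convs, hp_values['num_layers'])  # these two numbers agree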
