Convolutional neural network with Keras gives an error: UnboundLocalError: local variable 'a' referenced before assignment

Problem description

I wrote the code below, but every time it fails with "UnboundLocalError: local variable 'a' referenced before assignment". I used keras.layers.BatchNormalization(), and the program gives me this error. What should I do? What's wrong?

def make_CNN_model():

    model = Sequential()
    # input layer transformation (BatchNormalization + Dropout)
    model.add(layers.BatchNormalization(name='inputlayer',input_shape=(28,28,1)))
    model.add(layers.Dropout(name='Droupout_inputlayer',rates=0.3))

    # convolutional layer (Conv2D + MaxPooling2D + Flatten + Dropout)
    model.add(layers.Conv2D(filiters=32,activation='relu', name="Convoluationlayer_1",kernal_size=(3,3),border_mode='same'))
    model.add(layers.MaxPooling2D(name='MaxPooling_1'))
    model.add(layers.Flatten(name="Flaten_1"))
    model.add(layers.Dropout(rate=0.3))

    # fully connected layer (Dense + BatchNormalization + Activation + Dropout)
    model.add(layers.Dense(name="FullyConnectedLayer_1",units=50))
    model.add(layers.BatchNormalization())
    model.add(layers.Activation('relu'))
    model.add(layers.Dropout(rate=0.3))

    # output layer (Dense + BatchNormalization + Activation)
    model.add(layers.Dense(name = "Outputlayer", units=10))
    model.add(layers.BatchNormalization())
    model.add(layers.Activation('sigmod'))

    return model

model = make_CNN_model()
model.compile(
    optimizer='Adam',
    loss='categorical_crossentropy',
    metrics=['accuracy']
)
summary = model.fit(
    X_train, y_train_onehot,
    batch_size=5000,
    epochs=5,
    validation_split=0.2,
    verbose=1,
    callbacks=[time_summary]
)

Tags: python keras deep-learning

Solution


I can see some very obvious typos, e.g. 'rates' instead of 'rate' in model.add(layers.Dropout(name='Droupout_inputlayer',rates=0.3)).

Then, 'filiters' should be 'filters' and 'kernal_size' should be 'kernel_size' in model.add(layers.Conv2D(filiters=32,activation='relu', name="Convoluationlayer_1",kernal_size=(3,3),border_mode='same')).

Finally, 'sigmod' should be 'sigmoid' in model.add(layers.Activation('sigmod')).
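Putting those fixes together, here is a corrected sketch of the model (assuming tf.keras; the argument spellings below are the current Keras names, and note that border_mode is the old Keras 1 name for what is now padding):

```python
from tensorflow.keras import Sequential, layers

def make_CNN_model():
    model = Sequential()
    # input layer transformation (BatchNormalization + Dropout)
    model.add(layers.BatchNormalization(name='inputlayer', input_shape=(28, 28, 1)))
    model.add(layers.Dropout(name='Dropout_inputlayer', rate=0.3))  # rates -> rate

    # convolutional layer (Conv2D + MaxPooling2D + Flatten + Dropout)
    model.add(layers.Conv2D(filters=32,               # filiters -> filters
                            kernel_size=(3, 3),       # kernal_size -> kernel_size
                            padding='same',           # border_mode -> padding
                            activation='relu',
                            name='Convolutionlayer_1'))
    model.add(layers.MaxPooling2D(name='MaxPooling_1'))
    model.add(layers.Flatten(name='Flatten_1'))
    model.add(layers.Dropout(rate=0.3))

    # fully connected layer (Dense + BatchNormalization + Activation + Dropout)
    model.add(layers.Dense(name='FullyConnectedLayer_1', units=50))
    model.add(layers.BatchNormalization())
    model.add(layers.Activation('relu'))
    model.add(layers.Dropout(rate=0.3))

    # output layer (Dense + BatchNormalization + Activation)
    model.add(layers.Dense(name='Outputlayer', units=10))
    model.add(layers.BatchNormalization())
    model.add(layers.Activation('sigmoid'))           # sigmod -> sigmoid
    return model
```

With categorical_crossentropy on one-hot labels, a 'softmax' output activation would usually be the more conventional choice than 'sigmoid', but the sketch above keeps the asker's original design.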

I don't see any variable a anywhere in your code, so if I were you, I would make sure to fix those typos first, since they may be triggering this error somewhere inside the library.
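For reference, this is the kind of pattern that actually raises Python's UnboundLocalError: an assignment anywhere inside a function makes that name local to the function, so reading it before the assignment line fails (a minimal standalone sketch, unrelated to the asker's model code):

```python
a = 10  # module-level 'a'

def broken():
    print(a)  # read happens before the local assignment below
    a = 1     # this assignment makes 'a' local to broken()

try:
    broken()
    message = ''
except UnboundLocalError as e:
    message = str(e)  # mentions the local variable 'a'
```

So when Keras reports this error, some function inside the library (or code it generated from your arguments) hit this pattern; misspelled keyword arguments are a plausible way to send it down such a path.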
