keras access layer parameter of pre-trained model to freeze

Problem description

I saved an LSTM with multiple layers. Now, I want to load it and just fine-tune the last LSTM layer. How can I target this layer and change its parameters?

Example of a simple model trained and saved:

from keras.models import Sequential
from keras.layers import LSTM, Dense

model = Sequential()
# first layer: 100 units
model.add(LSTM(100, return_sequences=True,
               input_shape=(X.shape[1], X.shape[2])))
model.add(LSTM(50, return_sequences=True))
model.add(LSTM(25))
model.add(Dense(1))
model.compile(loss='mae', optimizer='adam')

I can load and retrain it, but I can't find a way to target a specific layer and freeze all the other layers.

Tags: python, machine-learning, keras, lstm, keras-layer

Solution


A simple solution is to give each layer a name, e.g.

model.add(LSTM(50, return_sequences=True, name='2nd_lstm'))

Then, after loading the model, you can iterate over its layers and freeze the ones whose names match your condition:

for layer in model.layers:
    if layer.name == '2nd_lstm':
        layer.trainable = False

You then need to recompile the model for the change to take effect, after which you can resume training as usual.
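Putting the steps together, here is a minimal end-to-end sketch. It assumes `tf.keras` (TensorFlow 2.x), uses hypothetical layer names (`1st_lstm`, `2nd_lstm`, `3rd_lstm`) and randomly generated dummy data, and, matching the question, keeps only the last LSTM layer trainable while freezing everything else:

```python
import numpy as np
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import LSTM, Dense

# Dummy data just for illustration: (samples, timesteps, features)
X = np.random.rand(8, 5, 3)
y = np.random.rand(8, 1)

# Build and compile a model with named layers (stands in for the saved model)
model = Sequential()
model.add(LSTM(100, return_sequences=True,
               input_shape=(X.shape[1], X.shape[2]), name='1st_lstm'))
model.add(LSTM(50, return_sequences=True, name='2nd_lstm'))
model.add(LSTM(25, name='3rd_lstm'))
model.add(Dense(1))
model.compile(loss='mae', optimizer='adam')

# Freeze every layer except the last LSTM
for layer in model.layers:
    layer.trainable = (layer.name == '3rd_lstm')

# Recompile so the trainable flags take effect, then fine-tune
model.compile(loss='mae', optimizer='adam')
model.fit(X, y, epochs=1, verbose=0)
```

In a real script you would replace the model construction above with `load_model(...)` on your saved file; the freeze loop and recompile work the same way either way.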
