What is the sequence length of a Keras Bidirectional layer?

Problem description

If I have:

        self.model.add(LSTM(lstm1_size, input_shape=(seq_length, feature_dim), return_sequences=True))
        self.model.add(BatchNormalization())
        self.model.add(Dropout(0.2))

Then my seq_length specifies how many slices of data I want to process at a time. In case it matters, my model is sequence-to-sequence (same size).

But if I have:

        self.model.add(Bidirectional(LSTM(lstm1_size, input_shape=(seq_length, feature_dim), return_sequences=True)))
        self.model.add(BatchNormalization())
        self.model.add(Dropout(0.2))

Does that double the sequence size? Or, at each timestep, does it look seq_length / 2 before and seq_length / 2 after that timestep?

Tags: python, keras, neural-network, lstm, bidirectional

Solution


Wrapping the LSTM in a Bidirectional layer has no effect on the sequence length — each direction processes the full sequence. I tested this with the following code:

from keras.models import Sequential
from keras.layers import Bidirectional, LSTM, BatchNormalization, Dropout

model = Sequential()
lstm1_size = 50
seq_length = 128
feature_dim = 20
model.add(Bidirectional(LSTM(lstm1_size, input_shape=(seq_length, feature_dim), return_sequences=True)))
model.add(BatchNormalization())
model.add(Dropout(0.2))

batch_size = 32

model.build(input_shape=(batch_size,seq_length, feature_dim))

model.summary()

This produces the following output with the Bidirectional layer:

_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
bidirectional_1 (Bidirection (32, 128, 100)            28400     
_________________________________________________________________
batch_normalization_1 (Batch (32, 128, 100)            400       
_________________________________________________________________
dropout_1 (Dropout)          (32, 128, 100)            0         
=================================================================
Total params: 28,800
Trainable params: 28,600
Non-trainable params: 200
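The parameter counts in the two summaries also confirm this: the Bidirectional layer is two independent LSTMs (forward and backward) of the same size, so its weight count is exactly double, while the 128-step time axis is untouched. A quick sketch of the arithmetic (lstm_param_count is a helper name introduced here for illustration, not a Keras API):

```python
def lstm_param_count(units, input_dim):
    # Each of the 4 LSTM gates has an input kernel (input_dim x units),
    # a recurrent kernel (units x units), and a bias vector (units).
    return 4 * (units * (input_dim + units) + units)

lstm_params = lstm_param_count(50, 20)  # matches lstm_1 below: 14,200
bidi_params = 2 * lstm_params           # two direction-wise LSTMs: 28,400
print(lstm_params, bidi_params)         # 14200 28400
```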

And without the Bidirectional layer:

_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
lstm_1 (LSTM)                (None, 128, 50)           14200     
_________________________________________________________________
batch_normalization_1 (Batch (None, 128, 50)           200       
_________________________________________________________________
dropout_1 (Dropout)          (None, 128, 50)           0         
=================================================================
Total params: 14,400
Trainable params: 14,300
Non-trainable params: 100
_________________________________________________________________
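To see why the feature dimension becomes 100 while the sequence length stays 128: by default Bidirectional uses merge_mode="concat", which joins the forward and backward outputs feature-wise at each timestep (the backward outputs are re-reversed first, so timesteps line up). A toy illustration with plain lists — the forward/backward values are stand-ins, not real LSTM outputs:

```python
seq_length, units = 128, 50

# Stand-ins for the per-timestep outputs of the two directions.
forward  = [[0.0] * units for _ in range(seq_length)]
backward = [[0.0] * units for _ in range(seq_length)]

# merge_mode="concat" joins the two outputs feature-wise per timestep.
merged = [f + b for f, b in zip(forward, backward)]

print(len(merged), len(merged[0]))  # 128 100: length unchanged, features doubled
```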
