How to use TimeDistributed before a GRU in Keras?

Problem description

I am building a CNN model whose input shape is (None, 100, 100, 1) and whose output is (400*1), but when I run the model an error occurs. Here is my model:

from keras.layers import (Input, Conv2D, BatchNormalization, LeakyReLU,
                          MaxPool2D, Add, TimeDistributed, GRU, Flatten, Dense)

visible_image1 = Input(shape=(100, 100, 1))
conv_1 = Conv2D(filters=64, kernel_size=(5, 5), padding='same')(visible_image1)
BatchNor_1 = BatchNormalization()(conv_1)
relu_1 = LeakyReLU(0.2)(BatchNor_1)
pool_1 = MaxPool2D(pool_size=(3, 3), strides=(3, 3))(relu_1)

conv_2 = Conv2D(filters=128, kernel_size=(5, 5), padding='same')(pool_1)
BatchNor_2 = BatchNormalization()(conv_2)
relu_2 = LeakyReLU(0.2)(BatchNor_2)
conv_3 = Conv2D(filters=128, kernel_size=(5, 5), padding='same')(relu_2)
BatchNor_3 = BatchNormalization()(conv_3)
relu_3 = LeakyReLU(0.2)(BatchNor_3)

# two parallel conv branches merged with Add
conv_4 = Conv2D(filters=256, kernel_size=(5, 5), padding='same')(relu_3)
BatchNor_4 = BatchNormalization()(conv_4)
conv_5 = Conv2D(filters=256, kernel_size=(5, 5), padding='same')(BatchNor_3)
BatchNor_5 = BatchNormalization()(conv_5)
add_1 = Add()([BatchNor_4, BatchNor_5])
relu_4 = LeakyReLU(0.2)(add_1)

# conv stack plus skip connection from relu_4
conv_6 = Conv2D(filters=128, kernel_size=(5, 5), padding='same')(relu_4)
BatchNor_6 = BatchNormalization()(conv_6)
relu_5 = LeakyReLU(0.2)(BatchNor_6)
conv_7 = Conv2D(filters=128, kernel_size=(5, 5), padding='same')(relu_5)
BatchNor_7 = BatchNormalization()(conv_7)
relu_6 = LeakyReLU(0.2)(BatchNor_7)
conv_8 = Conv2D(filters=256, kernel_size=(5, 5), padding='same')(relu_6)
BatchNor_8 = BatchNormalization()(conv_8)
add_2 = Add()([BatchNor_8, relu_4])
relu_7 = LeakyReLU(0.2)(add_2)

# conv stack plus skip connection from relu_7
conv_9 = Conv2D(filters=128, kernel_size=(5, 5), padding='same')(relu_7)
BatchNor_9 = BatchNormalization()(conv_9)
relu_8 = LeakyReLU(0.2)(BatchNor_9)
conv_10 = Conv2D(filters=128, kernel_size=(5, 5), padding='same')(relu_8)
BatchNor_10 = BatchNormalization()(conv_10)
relu_9 = LeakyReLU(0.2)(BatchNor_10)
conv_11 = Conv2D(filters=256, kernel_size=(5, 5), padding='same')(relu_9)
BatchNor_11 = BatchNormalization()(conv_11)
add_3 = Add()([BatchNor_11, relu_7])
relu_10 = LeakyReLU(0.2)(add_3)

# TimeDistributed + GRU, followed by the classifier head
time_1 = TimeDistributed(Dense(256))(relu_10)
gru_1 = GRU(256, return_sequences=True)(time_1)
flatten_1 = Flatten()(gru_1)
fc_1 = Dense(3000, activation='relu')(flatten_1)
fc_2 = Dense(1000, activation='relu')(fc_1)
fc_3 = Dense(401, activation='softmax')(fc_2)
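
For reference, the snippet stops before the model is assembled; presumably it would be wrapped up roughly as below (an assumption on my part, since the original post omits this step). Note that the error reported next is raised already at the GRU(...)(time_1) call, before any Model is built.

from keras.models import Model

# hypothetical assembly step, not shown in the original post
model = Model(inputs=visible_image1, outputs=fc_3)
model.summary()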

Error:

Input 0 is incompatible with layer gru_3: expected ndim=3, found ndim=4

As far as I know, the output shape of relu_10 is (None, 33, 33, 256). After the TimeDistributed layer the tensor should be 3D, because the GRU layer expects 3D input. My question is: how do I get a 3D output from the TimeDistributed layer?

What exactly does TimeDistributed do?
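
For context, a minimal standalone sketch (not the model above, and assuming the (None, 33, 33, 256) feature-map shape stated in the question): TimeDistributed applies the wrapped layer to every slice along axis 1 and keeps the tensor rank, so its output here is still 4D, which is why the GRU complains.

from keras.layers import Input, Dense, TimeDistributed
from keras.models import Model

# stand-in for relu_10's reported shape (None, 33, 33, 256)
features = Input(shape=(33, 33, 256))
# Dense is applied to every slice along axis 1; the rank does not change
td = TimeDistributed(Dense(256))(features)
print(Model(features, td).output_shape)  # (None, 33, 33, 256) -- still 4D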

Tags: python, tensorflow, keras

Solution


I just want to reproduce the result of the article "enter link description here", but I cannot fix the "TimeDistributed" problem. I wonder what other layer was used in that article? The GRU is only used to find the relation within the input data. I really want to know what is wrong with my code, because I am a new learner of Keras and have been coding for only a little more than two weeks.
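
One possible workaround (an assumption on my part, not something taken from the article): drop the rank from 4 to 3 before the GRU by reshaping the feature map into a sequence, for example treating each of the 33*33 spatial positions as a timestep.

from keras.layers import Reshape, GRU

# relu_10 is reported as (None, 33, 33, 256) in the question
seq = Reshape((33 * 33, 256))(relu_10)        # -> (None, 1089, 256), 3D
gru_1 = GRU(256, return_sequences=True)(seq)  # GRU now receives valid 3D input

Reshape((33, 33 * 256)) would instead treat each row of the feature map as a timestep; which of the two matches the article's intent is not clear from the question.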

