How to Apply Conv1D After a Dense or Flatten Layer: ValueError: Shapes (1, 1, 3) and (1, 1) are incompatible

Problem Description

How can a Conv1D layer be applied after a Dense or Flatten layer?

It gives the following error:

ValueError: Shapes (1, 1, 3) and (1, 1) are incompatible

The dataset is not a time series. Please do not suggest changing the order of the layers. The input data has 1000 rows and 50 features. The output y is multi-class with labels [0, 1, 2].

Here is a sample code:

from keras.layers import Flatten
from keras.layers.convolutional import Conv1D
from keras.layers.convolutional import MaxPooling1D
from keras.models import Sequential
from keras.layers import Dense
from keras.layers import Dropout
from keras.utils import to_categorical
import numpy as np
from sklearn.model_selection import train_test_split
import tensorflow as tf
import time  # needed for time.time() below

tf.get_logger().setLevel('ERROR')

verbose, epochs, batch_size = 0, 10, 1

x=np.random.randint(-10,10,(1000,50,1)).astype(float)
y=np.random.randint(0,3,(1000,1,1))

train_x, test_x, train_y, test_y = train_test_split(x, y, test_size=0.15, random_state=17)
train_y = to_categorical(train_y)
test_y = to_categorical(test_y)
n_features, n_outputs = train_x.shape[1], train_y.shape[1]
          
model = Sequential()
model.add(Dense(n_features, activation= 'relu'))
model.add(Conv1D(filters=64, kernel_size=3, activation='relu'))
model.add(Dropout(0.5))
model.add(MaxPooling1D(pool_size=2))
model.add(Flatten())
model.add(Dense(10, activation= 'relu'))
model.add(Dropout(0.2))
model.add(Dense(5, activation= 'relu'))
model.add(Dropout(0.2))
model.add(Dense(n_outputs, activation='softmax'))

t=time.time()
model.compile(loss='categorical_crossentropy', optimizer='adam', metrics=['accuracy'])
history=model.fit(train_x, train_y, epochs=epochs, batch_size=batch_size, verbose=verbose)
_, accuracy = model.evaluate(test_x, test_y, batch_size=batch_size, verbose=verbose)


print(accuracy)
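
For reference, printing the shapes produced by to_categorical in the snippet above shows where the incompatible shapes come from (a quick diagnostic sketch, not part of the original post):

import numpy as np
from keras.utils import to_categorical

y = np.random.randint(0, 3, (1000, 1, 1))   # same label array as above
y_cat = to_categorical(y)

print(y_cat.shape)     # (1000, 1, 3): to_categorical keeps the extra axis, so the targets are 3D
print(y_cat.shape[1])  # 1: this is what n_outputs picks up, so the last Dense layer gets a single unit
# categorical_crossentropy then compares a (1, 1) prediction against a (1, 1, 3) target, hence the ValueError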

Tags: python, tensorflow

Solution


I debugged your problem: the issue is not the Conv1D applied after the Dense layer, as your question suggests, but your last layer.

When you are doing multi-class classification, your output layer should have as many units as there are classes, which is 3 in your case.

Your train_y and test_y targets are also 3D instead of 2D, i.e. they need to have the shape (batch_size, num_classes).
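
Tracing the layer output shapes makes both points concrete (a minimal sketch of the same layer stack; the Dropout layers are omitted since they do not change any shapes):

from keras.models import Sequential
from keras.layers import Dense, Conv1D, MaxPooling1D, Flatten

m = Sequential()
m.add(Dense(50, activation='relu', input_shape=(50, 1)))     # Dense acts on the last axis: (batch, 50, 1) -> (batch, 50, 50)
m.add(Conv1D(filters=64, kernel_size=3, activation='relu'))  # -> (batch, 48, 64), so Conv1D after Dense is not the problem
m.add(MaxPooling1D(pool_size=2))                             # -> (batch, 24, 64)
m.add(Flatten())                                             # -> (batch, 1536)
m.add(Dense(3, activation='softmax'))                        # -> (batch, 3); the targets must have the same 2D shape
m.summary()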

So once you reshape your train_y and test_y and change n_outputs in the last layer accordingly, it will work for you. For convenience, I have pasted the code below. I have checked the code and it is working.

from keras.layers import Flatten
from keras.layers.convolutional import Conv1D
from keras.layers.convolutional import MaxPooling1D
from keras.models import Sequential
from keras.layers import Dense
from keras.layers import Dropout
from keras.utils import to_categorical
import numpy as np
from sklearn.model_selection import train_test_split
import tensorflow as tf
import time  # needed for time.time() below

tf.get_logger().setLevel('ERROR')

verbose, epochs, batch_size = 0, 10, 1

x=np.random.randint(-10,10,(1000,50,1)).astype(float)
y=np.random.randint(0,3,(1000,1,1))

train_x, test_x, train_y, test_y = train_test_split(x, y, test_size=0.15, random_state=17)
train_y = to_categorical(train_y)
test_y = to_categorical(test_y)
n_features, n_outputs = train_x.shape[1], train_y.shape[1]
          
train_y = train_y.reshape((train_y.shape[0], train_y.shape[2]))  # (batch, 1, 3) -> (batch, 3)
print(train_y.shape)

test_y = test_y.reshape((test_y.shape[0], test_y.shape[2]))  # (batch, 1, 3) -> (batch, 3)
print(test_y.shape)

model = Sequential()
model.add(Dense(n_features, activation= 'relu'))
model.add(Conv1D(filters=64, kernel_size=3, activation='relu'))
model.add(Dropout(0.5))
model.add(MaxPooling1D(pool_size=2))
model.add(Flatten())
model.add(Dense(10, activation= 'relu'))
model.add(Dropout(0.2))
model.add(Dense(5, activation= 'relu'))
model.add(Dropout(0.2))
model.add(Dense(3, activation='softmax'))  # 3 output units = number of classes



t=time.time()
model.compile(loss='categorical_crossentropy', optimizer='adam', metrics=['accuracy'])

history=model.fit(train_x, train_y, epochs=epochs, batch_size=batch_size, verbose=verbose)
_, accuracy = model.evaluate(test_x, test_y, batch_size=batch_size, verbose=verbose)


print(accuracy)
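
As a side note (not part of the original answer), the reshape can also be avoided altogether by squeezing the extra axes out of y before one-hot encoding, so that to_categorical directly returns a 2D (batch_size, num_classes) array:

import numpy as np
from keras.utils import to_categorical

y = np.random.randint(0, 3, (1000, 1, 1))
y = np.squeeze(y)             # (1000,): one integer label per sample
y_onehot = to_categorical(y)  # (1000, 3): already (batch_size, num_classes), no reshape needed
print(y_onehot.shape)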
