How to implement the TensorFlow 2 layer tf.nn.conv1d_transpose in a Keras model architecture?

Problem Description

I need to use a transposed Conv1D layer, which Keras does not have yet but TensorFlow 2 does. So far I have only coded in Keras. Is there a way to implement the tf.nn.conv1d_transpose layer directly in a Keras model, alongside other Keras layers?

Please provide some sample code.

Tags: tensorflow, keras, deep-learning, conv-neural-network

Solution


See the sample code below for adding tf.nn.conv1d_transpose to a Keras Sequential model via a Lambda layer:

%tensorflow_version 2.x

# Import dependencies
import tensorflow as tf
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Conv1D, MaxPooling1D, Dropout, BatchNormalization, Lambda

# Create a sequential model
model = Sequential()

# Fixed (non-trainable) filter for the transposed convolution.
# The filter shape is [kernel_width, output_channels, input_channels];
# the input to the Lambda layer below has 32 channels.
filters = tf.random.normal([3, 8, 32])

def conv1d_transpose(x):
    # Upsample the time axis by the stride factor (750 * 4 = 3000)
    # and map 32 input channels to 8 output channels.
    return tf.nn.conv1d_transpose(
        x, filters=filters,
        output_shape=[tf.shape(x)[0], 3000, 8],
        strides=4, padding="SAME")

model.add(Conv1D(32, 250, padding='same', input_shape=(1500, 9)))
model.add(MaxPooling1D(2))
model.add(Dropout(0.5))
model.add(BatchNormalization())
model.add(Lambda(conv1d_transpose, name='conv1d_transpose'))

# Display the model
model.summary()

Output:

Model: "sequential"
    _________________________________________________________________
    Layer (type)                 Output Shape              Param #   
    =================================================================
    conv1d (Conv1D)              (None, 1500, 32)          72032     
    _________________________________________________________________
    max_pooling1d (MaxPooling1D) (None, 750, 32)           0         
    _________________________________________________________________
    dropout (Dropout)            (None, 750, 32)           0         
    _________________________________________________________________
    batch_normalization (BatchNo (None, 750, 32)           128       
    _________________________________________________________________
    conv1d_transpose (Lambda)    (100, 1024, 8)            0         
    =================================================================
    Total params: 72,160
    Trainable params: 72,096
    Non-trainable params: 64
    _________________________________________________________________
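Note that a Lambda layer tracks no trainable weights, so the filter above stays fixed during training. If you want the transposed convolution to learn its kernel, one option is to wrap tf.nn.conv1d_transpose in a custom layer that creates the kernel with add_weight. The following is a minimal sketch under TF 2.x; the class name Conv1DTransposeLayer and its hyperparameters are illustrative, not part of the original answer:

import tensorflow as tf

class Conv1DTransposeLayer(tf.keras.layers.Layer):
    """Trainable transposed 1-D convolution built on tf.nn.conv1d_transpose."""

    def __init__(self, filters, kernel_size, strides=1, padding="SAME", **kwargs):
        super().__init__(**kwargs)
        self.filters = filters
        self.kernel_size = kernel_size
        self.strides = strides
        self.padding = padding

    def build(self, input_shape):
        in_channels = int(input_shape[-1])
        # Kernel shape is [kernel_width, output_channels, input_channels].
        self.kernel = self.add_weight(
            name="kernel",
            shape=(self.kernel_size, self.filters, in_channels),
            initializer="glorot_uniform",
            trainable=True)

    def call(self, x):
        # With SAME padding, the time axis grows by the stride factor.
        new_length = tf.shape(x)[1] * self.strides
        return tf.nn.conv1d_transpose(
            x, self.kernel,
            output_shape=[tf.shape(x)[0], new_length, self.filters],
            strides=self.strides, padding=self.padding)

# Usage: drop-in replacement for the Lambda layer above.
# model.add(Conv1DTransposeLayer(8, 3, strides=4, name="conv1d_transpose"))

Also worth knowing: from TensorFlow 2.3 onward, Keras ships a built-in tf.keras.layers.Conv1DTranspose layer, so on recent versions neither the Lambda wrapper nor the custom layer is necessary.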
