waveGAN in Keras - dimensions of the convolutional layers

Problem description

I am trying to rewrite waveGAN (https://github.com/chrisdonahue/wavegan/) in Keras. This is what I have so far:

Generator:

from tensorflow.keras.layers import (Input, Dense, Reshape, Activation,
                                     Conv2DTranspose, Conv1D, LeakyReLU)
from tensorflow.keras.models import Model
from tensorflow.keras.optimizers import SGD

def defineGen(Gin, d=1, lr=1e-3):
    # Per-layer size factors, scaled by the model-size multiplier d
    shapes = [d*x for x in [256, 16, 8, 4, 2, 1]]

    # Project the latent vector and reshape it into a small 2-D feature map
    x = Dense(shapes[0])(Gin)
    x = Reshape((1, 16, 16))(x)
    x = Activation('relu')(x)

    # Stack of transposed convolutions meant to upsample towards the waveform
    x = Conv2DTranspose(25, (shapes[1], shapes[2]), padding='same')(x)
    x = Activation('relu')(x)

    x = Conv2DTranspose(25, (shapes[2], shapes[3]), padding='same')(x)
    x = Activation('relu')(x)

    x = Conv2DTranspose(25, (shapes[3], shapes[4]), padding='same')(x)
    x = Activation('relu')(x)

    x = Conv2DTranspose(25, (shapes[4], shapes[5]), padding='same')(x)
    x = Activation('relu')(x)

    x = Conv2DTranspose(25, (shapes[5], 1), padding='same')(x)
    G_out = Activation('tanh')(x)

    G = Model(inputs=[Gin], outputs=G_out)
    optimizer = SGD(lr=lr)

    G.compile(loss='binary_crossentropy', optimizer=optimizer)

    return G, G_out

G_in1 = Input(shape=[None,100])
G, G_out = defineGen(G_in1)
G.summary()
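
Note that Input(shape=[None, 100]) declares a per-sample shape of (None, 100), so the tensor entering Dense is 3-D and the following Reshape((1, 16, 16)) can only succeed at run time if the unknown middle dimension happens to be 1. A minimal sketch, assuming the usual flat 100-dimensional waveGAN latent vector, would instead be:

# Hypothetical alternative: one flat 100-dimensional latent vector per sample,
# so Dense(256) yields (batch, 256) and Reshape((1, 16, 16)) matches exactly
# (1 * 16 * 16 == 256).
G_in1 = Input(shape=(100,))
G, G_out = defineGen(G_in1)
G.summary()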

Discriminator:

def defineDisc(Din, d=1, lr=1e-3):
    # Kernel-size factors, scaled by the model-size multiplier d
    shapes = [d*x for x in [1, 2, 4, 8, 16]]

    x = Conv1D(25, kernel_size=shapes[0])(Din)
    x = LeakyReLU(alpha=0.1)(x)

    # phase shuffle - not implemented yet

    x = Conv1D(25, shapes[1], strides=4)(x)
    x = LeakyReLU(alpha=0.1)(x)

    # phase shuffle - not implemented yet

    x = Conv1D(25, shapes[2], strides=4)(x)
    x = LeakyReLU(alpha=0.1)(x)

    # phase shuffle - not implemented yet

    x = Conv1D(25, shapes[3], strides=4)(x)
    x = LeakyReLU(alpha=0.1)(x)

    # phase shuffle - not implemented yet

    x = Conv1D(25, shapes[4], strides=4)(x)
    x = LeakyReLU(alpha=0.1)(x)

    x = Reshape((256,))(x)  # flatten to a 256-vector (target_shape must be a tuple)

    Dout = Dense(256)(x)

    D = Model(inputs=[Din], outputs=Dout)
    optimizer = SGD(lr=lr)
    D.compile(loss="binary_crossentropy", optimizer=optimizer)

    return D, Dout

Din = Input(shape=[16384])
D, D_out = defineDisc(Din)
D.summary()
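
Conv1D expects a 3-D input of shape (batch, steps, channels), so Input(shape=[16384]) would need an explicit channel axis before it can feed the first convolution. A minimal sketch, assuming 16384 mono audio samples as in the WaveGAN paper:

# Conv1D operates on (batch, steps, channels); a mono waveform of 16384
# samples is therefore declared with an explicit channel axis.
audio_in = Input(shape=(16384, 1))
x = Conv1D(25, kernel_size=25, strides=4, padding='same')(audio_in)  # -> (batch, 4096, 25)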

Are the dimensions I am giving these convolutional layers correct? It would be very helpful if someone with TensorFlow and convolutional-network experience could offer some insight here.
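
For comparison (a sketch of how the reference architecture maps to Keras, not a definitive fix of the code above): in the linked TensorFlow implementation the factors 16, 8, 4, 2, 1 scale the channel counts of 1-D transposed convolutions, while every layer uses kernel length 25 and stride 4, so 25 is a kernel length rather than a filter count. Assuming TensorFlow >= 2.3 (for Conv1DTranspose) and the repo's model-size factor d (64 by default), a rough sketch of that layout is:

from tensorflow.keras.layers import Input, Dense, Reshape, Activation, Conv1DTranspose
from tensorflow.keras.models import Model

def wavegan_like_generator(latent_dim=100, d=64):
    # Sketch only: channel counts follow the 16d/8d/4d/2d/d pattern of the
    # reference generator; kernel length 25 and stride 4 do the upsampling.
    z = Input(shape=(latent_dim,))
    x = Dense(16 * 16 * d)(z)           # 256*d units
    x = Reshape((16, 16 * d))(x)        # (time=16, channels=16d)
    x = Activation('relu')(x)
    for ch in [8 * d, 4 * d, 2 * d, d]:
        x = Conv1DTranspose(ch, 25, strides=4, padding='same')(x)
        x = Activation('relu')(x)
    x = Conv1DTranspose(1, 25, strides=4, padding='same')(x)  # -> (16384, 1)
    out = Activation('tanh')(x)
    return Model(z, out)

With this layout the time axis grows 16 -> 64 -> 256 -> 1024 -> 4096 -> 16384 while the channel count shrinks 16d -> 8d -> 4d -> 2d -> d -> 1.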

Tags: python, tensorflow, keras, tf.keras
