'BatchNormalization' object has no attribute 'outbound_nodes'

Problem description

I am trying to implement DarkNet by following the code provided in this GitHub repository, in order to benchmark Swish and another custom activation function on the CIFAR-10 dataset.

Code section:

from keras.layers import (Input, Conv2D, BatchNormalization, Dense,
                          GlobalAveragePooling2D, add)
from keras.models import Model
from keras.regularizers import l2
from keras import backend as K
import tensorflow as tf


def conv2d_unit(x, filters, kernels, strides=1):
    """Convolution Unit
    This function defines a 2D convolution operation with BN and a custom
    activation (mish).
    # Arguments
        x: Tensor, input tensor of conv layer.
        filters: Integer, the dimensionality of the output space.
        kernels: An integer or tuple/list of 2 integers, specifying the
            width and height of the 2D convolution window.
        strides: An integer or tuple/list of 2 integers,
            specifying the strides of the convolution along the width and
            height. Can be a single integer to specify the same value for
            all spatial dimensions.
    # Returns
            Output tensor.
    """
    x = Conv2D(filters, kernels,
               padding='same',
               strides=strides,
               activation='linear',
               kernel_regularizer=l2(5e-4))(x)
    x = BatchNormalization()(x)
    #x=tf.layers.batch_normalization(x)
    x = mish(x)

    return x


def residual_block(inputs, filters):
    """Residual Block
    This function defines a residual block built from two convolution units
    and a shortcut connection.
    # Arguments
        inputs: Tensor, input tensor of the residual block.
        filters: Integer, the dimensionality of the output space of the
            first convolution unit (the second one uses 2 * filters).
    # Returns
        Output tensor.
    """
    x = conv2d_unit(inputs, filters, (1, 1))
    x = conv2d_unit(x, 2 * filters, (3, 3))
    x = add([inputs, x])
    x = mish(x)

    return x


def stack_residual_block(inputs, filters, n):
    """Stacked residual Block
    """
    x = residual_block(inputs, filters)

    for i in range(n - 1):
        x = residual_block(x, filters)

    return x


def darknet_base(inputs):
    """Darknet-53 base model.
    """

    x = conv2d_unit(inputs, 32, (3, 3))

    x = conv2d_unit(x, 64, (3, 3), strides=2)
    x = stack_residual_block(x, 32, n=1)

    x = conv2d_unit(x, 128, (3, 3), strides=2)
    x = stack_residual_block(x, 64, n=2)

    x = conv2d_unit(x, 256, (3, 3), strides=2)
    x = stack_residual_block(x, 128, n=8)

    x = conv2d_unit(x, 512, (3, 3), strides=2)
    x = stack_residual_block(x, 256, n=8)

    x = conv2d_unit(x, 1024, (3, 3), strides=2)
    x = stack_residual_block(x, 512, n=4)

    return x


def darknet():
    """Darknet-53 classifier.
    """
    inputs = Input(shape=(32, 32, 3))
    x = darknet_base(inputs)

    x = GlobalAveragePooling2D()(x)
    x = Dense(100, activation='softmax')(x)

    model = Model(inputs, x)

    return model

Example of the custom activation function layer:

def mish(x):
    # Element-wise x * sigmoid(x), applied through a tf.keras Lambda layer.
    return tf.keras.layers.Lambda(lambda x: x * K.sigmoid(x))(x)
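
For context, compiling and training the model presumably looks something like the sketch below; the optimizer, loss, batch size and preprocessing here are assumptions for illustration and are not taken from the question or the repository.

from keras.datasets import cifar10
from keras.optimizers import Adam
from keras.utils import to_categorical

# Assumed preprocessing: scale images to [0, 1] and one-hot encode the labels.
(x_train, y_train), (x_test, y_test) = cifar10.load_data()
x_train = x_train.astype('float32') / 255.0
x_test = x_test.astype('float32') / 255.0
y_train = to_categorical(y_train, 10)
y_test = to_categorical(y_test, 10)

# Note: CIFAR-10 has 10 classes, so the Dense(100, ...) head in darknet()
# above would need to be Dense(10, ...) for the shapes to match.
model = darknet()
model.compile(optimizer=Adam(lr=1e-3),
              loss='categorical_crossentropy',
              metrics=['accuracy'])
model.fit(x_train, y_train,
          batch_size=64,
          epochs=10,
          validation_data=(x_test, y_test))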

However, when compiling the model and trying to train it, I get the following error:

AttributeError: 'BatchNormalization' object has no attribute 'outbound_nodes'

When I replace the Keras BatchNormalization layer with the TensorFlow batch normalization layer, that error goes away, but the same error now pops up for the Conv2D layer:

AttributeError: 'Conv2D' object has no attribute 'outbound_nodes'

Keras version: '2.2.4'; TensorFlow version: '1.13.1'

All of the code is run on Google Colab.

Do I have to replace all the Keras layers with the corresponding TF layers?

Is this a Keras/TensorFlow conflict error?

Tags: python, tensorflow, conv-neural-network, google-colaboratory, batch-normalization

Solution
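
An 'outbound_nodes' AttributeError like this typically shows up when layers from the standalone keras package and from tf.keras are mixed in one model: the two APIs track the layer graph under different attribute names, so a layer from one cannot be wired into a graph built by the other. In the code above, Conv2D and BatchNormalization come from standalone Keras while the activation is wrapped in tf.keras.layers.Lambda, which is exactly that kind of mix. Below is a minimal sketch of the activation written entirely with the standalone Keras API; it is an assumption about the intended fix, not a verified answer for this exact setup.

from keras.layers import Lambda
from keras import backend as K

def mish(x):
    # Stay inside standalone Keras: keras.layers.Lambda instead of
    # tf.keras.layers.Lambda, with the Keras backend for the math.
    return Lambda(lambda t: t * K.sigmoid(t))(x)

The symmetrical option is to build everything from tf.keras (imports, layers, Model, regularizer) and drop the standalone keras imports; either way, the point is not to mix the two namespaces within a single model.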

